YouTube and Cyando—Advocate General Unpicks CJEU’s Case Law on Communication to the Public


This article was first published in Entertainment Law Review on 13 October 2020.

On 16 July 2020, Advocate General Saugmandsgaard Øe’s Opinion in YouTube and Cyando was published.1 It is an ambitious attempt to re-think EU copyright law and platform regulation in 256 paragraphs.2

The Opinion is handed down against the backdrop of mounting pressure on both sides of the Atlantic to strengthen platform regulation—in the EU, specifically, the ongoing Digital Services Act process, as well as the implementation in EU Member States of the Directive on Copyright in the Digital Single Market,3 art.17 of which goes to the liability of platforms such as YouTube for copyright infringement.

Background

The case concerns the two platforms YouTube and Cyando, the latter of which owns the cyberlocker Uploaded.

In the YouTube case, videos incorporating Sarah Brightman recordings were uploaded to YouTube. Following notification by the record producer, Frank Peterson, the videos were taken down, but they re-appeared and Peterson brought an action against YouTube.

The Cyando case concerns the cyberlocker Uploaded, which permits users to upload content and provides a functionality whereby users can share a link to that content. Elsevier, the publisher, notified Cyando that three e-books were accessible on Uploaded’s servers via links on third party websites. Elsevier brought an action against Cyando.

The German Federal Court of Justice made separate references in respect of the two cases, concerning—broadly—the following three questions:

  • whether the platforms infringe copyright by communicating to the public, under art.3 of the 2001 Copyright Directive;4
  • whether the platforms are eligible for the so-called hosting safe harbour, in art.14 of the E-Commerce Directive;5 and
  • whether the rights holders are able to obtain injunctions against the platforms, pursuant to the 2001 Copyright Directive art.8(3).

In view of the similarity of the cases, the CJEU joined the cases and a hearing took place before its Grand Chamber on 26 November 2019.

This comment will briefly consider the first two questions, aware that it is difficult to do this lengthy Opinion justice in a brief comment.

At the outset, it is noted that this is an Opinion informed by a strong consumer and internet user perspective—arguably to the detriment of the text of the relevant Directives and the CJEU’s case law.6

Communication to the public

The 2001 Copyright Directive itself is short on detail on the exclusive right of communication to the public, including making available, provided for under the 2001 Copyright Directive art.3. It requires: (i) a communication; and (ii) that that communication is to a public. The CJEU, however, has since the 2006 case SGAE7 elaborated a set of criteria for determining whether an act constitutes a communication to the public.

On the question of communication to the public, Advocate General Saugmandsgaard Øe recommends that the CJEU find that neither YouTube nor Cyando communicates to the public; only the users uploading unauthorised content do.

In this part of his Opinion, the Advocate General invites the court to revisit its case law on art.3.8 In particular, he explicitly rejects the principles established in the GS Media, Filmspeler and The Pirate Bay cases.9 In all these cases, the CJEU has applied a knowledge criterion as part of the test of whether an intervention constitutes a communication to the public.

By way of illustration, in GS Media, the posting of links to unauthorised content was held to constitute a communication to the public where the person doing so knew or ought to have known that the links provided access to unauthorised works; and such knowledge is presumed where the person posting the links pursues financial gain.10 The court applied similar reasoning in the Filmspeler and The Pirate Bay cases.

While the CJEU’s approach has occasionally been criticised by copyright purists for blurring the distinction between direct and indirect liability, it is a pragmatic one and one which ensures the law remains relevant and applicable to the technological reality of today.

In his Opinion in The Pirate Bay, Advocate General Szpunar put it thus:

“The European Commission, whose opinion appears to me to be shared by the United Kingdom of Great Britain and Northern Ireland, contends that liability for sites of this type is a matter of copyright application, which can be resolved not at the level of EU law but under the domestic legal systems of the Member States. Such an approach would, however, mean that liability, and ultimately the scope of the copyright holders’ rights, would depend on the very divergent solutions adopted under the different national legal systems. That would undermine the objective of EU legislation in the relatively abundant field of copyright, which is precisely to harmonise the scope of the rights enjoyed by authors and other rightholders within the single market. That is why the answer to the problems raised in the present case must, in my view, be sought rather in EU law.”11

Advocate General Saugmandsgaard Øe, however, is of the view that this approach is unsupported by the text of the Directive and therefore must be rejected, together with a long line of judgments. Instead, the Opinion offers its own construction of EU copyright law, resting on a very clear differentiation between notions of direct and secondary liability.

Direct infringement

The Opinion thus first considers whether the platforms are directly liable for infringement of the communication to the public right. While acknowledging that platforms play a crucial role in communicating content to the public, the Advocate General’s reasoning appears based on the premise that only one person can be liable for communication to the public, the person playing the “more fundamental” role:

“In absolute terms, any intermediary plays an important, or even crucial, role in that transmission, as it is one of the links in the chain making it possible. However, the role played by the person in question is more fundamental. The role is ‘essential’ because it is that person who decides to transmit a given work to a public and who actively initiates that ‘communication’.”12

It is noted that there is nothing in the 2001 Copyright Directive which supports the Advocate General’s requirement that only one person can be liable for communication to the public; nor is there anything in the CJEU’s case law.

In addition, the Opinion is informed by a narrow reading of Recital 27 of the 2001 Copyright Directive, which provides that a service provider does not carry out a communication to the public provided it simply engages in the mere provision of physical facilities. The Advocate General takes the view that this does not mean that the provider cannot optimise access to the content transmitted by organising its service—the physical facilities themselves do not need to be “mere”; it is the provision of them that must be mere.13

Consequently, the Advocate General concludes that the platforms cannot be held directly liable, although he acknowledges the possibility of some form of secondary liability on the part of the platform operators under national law.

Secondary liability

The Advocate General nevertheless considers the alternative position, that is, were the CJEU to apply its own case law to the cases at hand—or at least he attempts to do so. He reaches the conclusion that, even on the basis of the principles developed by the court, YouTube is not liable for communication to the public, whereas matters are a little less clear cut when it comes to Cyando.

It is evident that the Advocate General is uncomfortable applying the case law he has dismissed. In terms of the knowledge criterion, he states as follows:

“The manner in which this criterion is to be interpreted in the present cases is much less clear. The problem arises because there is no framework in EU law relating to this mental element. I can therefore only speculate …”14

Indeed, the Advocate General is at pains to limit the reach of the court’s previous case law and questions its applicability to the cases at hand, going so far as to suggest that the approach taken by the court in the GS Media case is confined to links:

“It is true that in GS Media the Court ruled that when the person who posts on a website hyperlinks to protected works published without the authorisation of their author on another website does so for the purpose of making a profit, it must be presumed (subject to rebuttal) that that person had knowledge of the protected nature of those works and of that lack of authorisation. However, aside from the fact that, in its subsequent case law, the Court seems to have confined this approach to hyperlinks, I think that, in any event, this presumption cannot be applied in this [sic] present cases.”15

However, the Advocate General takes the view that the CJEU’s case law permits holding a service provider liable where it has the intention, in providing its services, to facilitate third party infringement, even in the absence of knowledge of specific infringements.16

In applying this view, he concludes that an operator cannot be held liable merely because the structure of its platform enables users to publish content by an automated process and it does not check the compliance of that content with the law prior to uploading it. This begs the question—why not?

The Advocate General’s response is to run the test for art.3 together with the E-Commerce Directive art.15:

“the provider again cannot be expected, in accordance with Article 15(1) of Directive 2000/31, to monitor in a general manner all the files which users of its service intend to publish before they are uploaded.”17

It is not clear that this conflation of the test for infringement and the separate prohibition of a general obligation to monitor information is correct.

The Opinion acknowledges that a particular feature of Cyando, the partnership programme, whereby Cyando remunerates users based on the number of downloads of files uploaded by them, may support a finding that the provider’s intervention in the illegal acts of its users is “deliberate” and thus an infringement of art.3.18

Hosting safe harbour

The E-Commerce Directive art.14 provides that a hosting provider is not liable for the information stored at the request of its users, on two conditions: (i) the provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; and (ii) the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or disable access to the information.

Drawing on Recital 42 of the Directive, the CJEU has, in the eBay case,19 additionally found that only a passive hosting provider can benefit from this liability limitation; a service provider which plays an active role cannot do so.

The Advocate General’s two key conclusions in respect of YouTube and Cyando are that:

  • the platforms are not active in respect of the information they store and can thus benefit from the safe harbour in respect of all liability resulting from files stored at the request of their users; and
  • the phrases “actual knowledge” and “awareness of facts or circumstances from which illegal activity is apparent” refer to “specific illegal information”.

Active role

In keeping with his approach to art.3, the Advocate General’s conclusions are informed by a narrow interpretation of what playing an active role entails. In his view, an active role requires that a hosting provider acquires “intellectual control of [the] content”.20 Apart from being an entirely new criterion (ironically developed by an Advocate General otherwise at pains to emphasise the need for strict textual interpretation of legislation), it sets a very high threshold. The Advocate General considers this threshold met if the provider:

  • selects the stored information;
  • is actively involved in the content of that information in some other way; or
  • presents that information to the public in such a way that it appears to be its own.

The Advocate General notes: “In those circumstances, the provider goes outside of the role of an intermediary for information provided by users of its service: it appropriates that information”.21 It is true that these criteria bring a provider outside the role of a passive intermediary; these are actions more often associated with traditional media providers, such as broadcasters and SVOD providers, who license in content and present it in their own channels or user interface. The threshold seems to be so high as to render the “active role” condition meaningless.

Other means of control, such as controlling access to the information,22 structuring the presentation of videos or providing a search function,23 or making recommendations to the user of content she may wish to consume,24 do not suffice to demonstrate an active role.

Much as with his rejection of the CJEU’s case law under art.3, the Advocate General invites the court to re-think its approach to “active role”, which many informed observers, at least, had assumed was the correct one:

“[T]o my mind, it is immaterial that a provider structures how the information provided by the users of its service is presented on its platform or on its website in order to facilitate its use and thus optimise access to that information. I think that the argument to the contrary put forward by Mr Peterson and the French Government in particular reflects a misunderstanding of the judgment in L’Oréal v eBay. Although the Court ruled in that judgment that a provider such as eBay plays an ‘active role’ where it provides assistance to certain sellers, in the case of certain offers for sale, which entails ‘optimising the presentation [of those offers]’, the Court had in view the fact that eBay sometimes provides individual assistance on how to optimise, exploit and structure the content of specific offers. By providing such assistance, eBay is actively involved in the content of the offers in question, as envisaged in point 152 of this Opinion.”25 (emphasis in original)

In addition, the Advocate General appears to take the view that the fact that a function is “automated” in itself implies that the platform is passive in relation to the information processed.26 That is, the choice by a platform operator to automate certain functions and to design algorithms to enable automation results in it being able to avail itself of the safe harbour.

On this basis, it is unsurprising that the Advocate General takes the view that YouTube and Cyando do not play an active role.

Knowledge and awareness

It is recalled that in order to avail itself of the safe harbour, a hosting provider must not have actual knowledge of illegal activity or information and, as regards claims for damages, not be aware of facts or circumstances from which the illegal activity or information is apparent.

In the eBay judgment, the CJEU established, in respect of the awareness test, that:

“[I]t is sufficient, in order for the provider of an information society service to be denied entitlement to the exemption from liability provided for in Article 14 of Directive 2000/31, for it to have been aware of facts or circumstances on the basis of which a diligent economic operator should have identified the illegality in question and acted in accordance with Article 14(1)(b) of Directive 2000/31.

Moreover, if the rules set out in Article 14(1)(a) of Directive 2000/31 are not to be rendered redundant, they must be interpreted as covering every situation in which the provider concerned becomes aware, in one way or another, of such facts or circumstances.”27

The Advocate General takes the view that the knowledge and awareness standards required are ones relating to specific illegal information. Such knowledge or awareness, he suggests, can only be provided by means of a very specific notification. The Advocate General attempts to reconcile this with the eBay judgment, by stating that what the court had in mind was that a provider is obliged diligently to process facts and circumstances brought to its knowledge, in particular by notifications, concerning specific illegal information. Requiring a service provider to actively seek facts or circumstances would not be compatible with either art.14 or art.15, which prohibits Member States from imposing on hosting providers a general obligation to monitor the information stored. It is noted, however, that imposing specific monitoring obligations is not prohibited, nor are platforms themselves prohibited from voluntarily monitoring their service. Indeed, platforms do comprehensively monitor their services—not only for the purpose of detecting illegal or unauthorised content, but also for the purpose of harvesting user data for profit-making purposes. It is consequently unclear that the Advocate General’s recommendation sits well either with the court’s case law or with technological reality.

Nonetheless, it follows, in the Advocate General’s view, that each instance of infringement must be separately notified to the platform; the platform cannot, even once notified, be required to remove the same video if it is re-uploaded—the re-upload has to be separately notified in order to impute the requisite knowledge. In short, there is no stay-down obligation inherent in the art.14 regime. Apart from sitting awkwardly with the eBay judgment, the practical effect of this is to shift the entire burden of compliance onto rightholders. Put differently, platforms benefit financially from their users uploading unauthorised content, but the investment in monitoring their services for unauthorised content is to be made by rightholders. The Opinion justifies this by describing many situations relating to copyright infringement as ambiguous in the absence of context; as such, a general obligation would create a risk of systematic over-removal in order to avoid the risk of liability, posing an obvious problem in terms of freedom of expression. Consequently, a notification must provide evidence that would allow a diligent economic operator in its situation to establish the illegal character of the information without difficulty and without conducting a detailed legal or factual examination.28

In very many cases relating to copyright content, it is, in fact, not difficult or ambiguous to ascertain that the use of a song or an excerpt from a film has not been authorised.29 Instead, it is a question of whether one wants to do so and whether one is prepared to invest in such compliance. While the Advocate General’s view is informed by a genuine, and clearly both justified and important, concern over “over removal”, it is doubtful that his proposed solution either applies the existing law correctly or arrives at a good balance between the interests of users, rights holders and platforms.

Directive on Copyright in the Digital Single Market

Finally, the court had requested the parties to make oral submissions on art.17 of the Directive on Copyright in the Digital Single Market. While that Directive, which was not in force or implemented at the time the facts of these cases arose, does not bind the court, its provisions on platform liability will supersede the legal provisions at issue in this Opinion, when it comes to online content sharing service providers, such as YouTube. It is, however, noted that this apparently will not be the case in the UK, which has announced it will not bring the Directive into domestic law.

In particular, art.17 provides that online content sharing service providers communicate to the public within the meaning of art.3 and fall outside the hosting safe harbour, albeit that safe harbour is replaced with an alternative. It is noted that the Advocate General characterises this regime, not as a clarification of the existing provisions, but as something more akin to a sui generis regime—a characterisation that continues to divide law-makers, both at the EU and national levels.

Conclusions

As noted above, this Opinion comes against a policy background of concern over the virtually unchecked power of platforms and a move towards stricter regulation. There are clear and justified concerns over private censorship, that is, platforms being the sole arbiters of freedom of expression. Some of this appears to have informed the Opinion. That said, it can be questioned whether the best approach to unchecked platform power is to absolve them further from liability.

The radical departure from case law advocated by the Advocate General appears primarily motivated by a concern with safeguarding user and consumer rights. This is clearly a crucial concern and one that is important in a context where users often lack a voice. However, the Opinion’s proposed interpretations of the law would result in a balance which disproportionately favours platforms, rather than users, over rights holders. Indeed, it cannot be in the interest of users or consumers of content—films, music, books—to undermine the role of creators and producers of content. Who would then create and invest in the content so sought after by users and consumers?

It remains to be seen whether the court will follow the Advocate General. While the CJEU statistically has followed its Advocates General, this Opinion advocates a fairly radical re-think of established principles. The court’s judgment is expected in the autumn of 2020.

1 Opinion of the Advocate General of 16 July 2020, Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586.

2 And 248 footnotes.

3 Directive 2019/790 on copyright and related rights in the Digital Single Market and amending Directives 96/9 and 2001/29 [2019] OJ L130/92.

4 Directive 2001/29 on the harmonisation of certain aspects of copyright and related rights in the information society [2001] OJ L167/10.

5 Directive 2000/31 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1.

6 See, e.g. Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [188]–[189] and [243].

7 Sociedad General de Autores y Editores de España (SGAE) v Rafael Hoteles SA (C-306/05) EU:C:2006:764; [2007] Bus. L.R. 521.

8 See, in particular, Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [65] and [104].

9 GS Media BV v Sanoma Media Netherlands BV (C-160/15) EU:C:2016:644; [2016] Bus. L.R. 1231; Stichting Brein v Jack Frederik Wullems (Filmspeler) (C-527/15) EU:C:2017:300; [2017] Bus. L.R. 1816; Stichting Brein v Ziggo BV and XS4All Internet BV (C-610/15) (“The Pirate Bay”) EU:C:2017:456; [2017] Bus. L.R. 1899.

10 GS Media BV v Sanoma Media Netherlands BV (C-160/15) EU:C:2016:644 at [40]–[51].

11 Stichting Brein v Ziggo BV and XS4All Internet BV (C-610/15) (“The Pirate Bay”) EU:C:2017:456 at [3].

12 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [73].

13 See, e.g. Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [82].

14 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [110].

15 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [113].

16 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [120].

17 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [124].

18 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [122]–[124].

19 L’Oréal SA v eBay International AG (C-324/09) EU:C:2011:474; [2012] Bus. L.R. 1369.

20 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [152].

21 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [152].

22 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [155].

23 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [157].

24 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [161]–[162].

25 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [159].

26 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [160].

27 L’Oréal SA v eBay International AG (C-324/09) EU:C:2011:474 at [120]–[121].

28 Frank Peterson v Google and Elsevier v Cyando (C-682/18 and C-683/18) EU:C:2020:586 at [190].

29 The question of use under exceptions is slightly different (and beyond the scope of this comment), but does not pose real obstacles to ensuring unauthorised content does not appear on platforms and is removed when it does so appear.