Advocate General opines that a social media company can be ordered to seek and identify information posted by any user that is identical to information already found to be illegal

The claimant, Eva Glawischnig-Piesczek, is a member of the Austrian National Council and chair and spokesperson for the Green Party.

Ms Glawischnig-Piesczek issued proceedings in the Austrian courts for defamation and applied for an injunction ordering the defendant, Facebook Ireland Ltd, to remove defamatory content from its platform. The content came from an article in the Austrian online news magazine oe24.at, entitled “Greens: Minimum income for refugees should stay”, which a user had shared on his personal page. Sharing the article generated on that page a thumbnail of the oe24.at site, comprising the title and a brief summary of the article, together with a photograph of Ms Glawischnig-Piesczek. The user also published a disparaging comment about Ms Glawischnig-Piesczek. All the content could be accessed by any Facebook user.

Ms Glawischnig-Piesczek sought an order requiring Facebook to cease publication and/or dissemination of photographs of her if the accompanying message disseminated the same allegations as the content in question and/or “equivalent content”.

At first instance, the court granted the order. The matter eventually reached the Austrian Supreme Court, which held that the statements in question were intended to damage the reputation of Ms Glawischnig-Piesczek, to insult her and to defame her.

As for whether the injunction could be extended worldwide to cover postings with identical wording and/or equivalent content of which Facebook was not aware, the Supreme Court referred questions to the Court of Justice of the European Union on the interpretation of the E-commerce Directive (2000/31/EC).

Advocate General Szpunar noted that under the Directive, a host (including the operator of a social network platform such as Facebook) is, in principle, not liable for the information stored on its servers by third parties if it is not aware of the illegal nature of that information. However, once made aware of its illegality, the host must delete that information or block access to it. The Directive also provides that a host cannot be placed under a general obligation to monitor the information it stores, nor a general obligation actively to seek facts or circumstances indicating illegal activity.

The AG opined that the Directive does not preclude a host operating a social network platform, such as Facebook, from being ordered, in the context of an injunction, to seek and identify, among all the information disseminated by users of that platform, any information identical to information that has been found to be illegal by the court that issued the injunction.

In the AG’s view, that approach ensures a fair balance between the fundamental rights involved, namely the protection of private life and personality rights, the protection of freedom to conduct a business, and the protection of freedom of expression and information. First, it does not require sophisticated, burdensome techniques. Secondly, in view of the ease with which information can be reproduced on the internet, such an approach is necessary in order to ensure the effective protection of privacy and personality rights.

As for information that is equivalent, rather than identical, to content found to be illegal, the AG also said that the host could be ordered to seek and identify that information, but only in relation to postings by the user concerned. A court adjudicating on the removal of such equivalent information must ensure that the effects of its injunction are clear, precise and foreseeable. In doing so, it must weigh up the fundamental rights involved and take account of the principle of proportionality, the AG said.

The AG said that an obligation to identify equivalent information originating from any user would not ensure a fair balance between the fundamental rights concerned, as it would be costly and could lead to censorship, such that freedom of expression and information might well be systematically restricted.

Further, in the AG’s opinion, since the Directive does not regulate the territorial scope of an obligation to remove information disseminated via a social network platform, it does not preclude a host from being ordered to remove such information worldwide. Both the question of the extraterritorial effects of an injunction imposing a removal obligation and the question of the territorial scope of such an obligation should be analysed by reference to public and private international law, the AG said.

In addition, the AG opined that the Directive does not preclude a host from being ordered to remove information equivalent to the information already found to be illegal, where it has been made aware of that equivalent information by the person concerned, third parties or another source, as, in that case, the removal obligation does not entail general monitoring of information stored. (Case C-18/18 Eva Glawischnig-Piesczek v Facebook Ireland Ltd EU:C:2019:458 (Opinion of Advocate General) (4 June 2019) — to access the Opinion in full, go to the Curia search form, type in the case number and follow the link).