Eva Glawischnig-Piesczek v Facebook Ireland Limited: a new layer of neutrality

Authors:

Elda Brogi, Scientific Coordinator of the Centre for Media Pluralism and Media Freedom (CMPF)

Marta Maroni, PhD candidate in Constitutional Law, Faculty of Law, University of Helsinki*

On 3 October 2019, the Court of Justice of the European Union (CJEU) ruled in Eva Glawischnig-Piesczek v Facebook Ireland Limited (2019). The ruling is a landmark decision which revisits the understanding of the prohibition of general monitoring obligations for hosting providers (Directive 2000/31/EC, the Directive on electronic commerce, hereinafter ECD) by creating a second-order specification of neutrality, built on the sophistication that automation has now reached. Under the European regime, for a platform/intermediary to benefit from the liability exemption it must qualify as a hosting provider, which means it plays a passive role: while providing a space for interaction and the production of content, the platform should have no knowledge of the content disseminated through its service; yet once it obtains knowledge of an illegal activity, it must expeditiously remove or disable access to the information (art. 14 ECD). In addition, EU Member States may not impose a general monitoring obligation on these services (art. 15 ECD). While the pervasive use of algorithms has increasingly shaken the tenets of this construction, the EU judges have refined their creative skills to hold on to this concept, as we can appreciate in this ruling.

Facts of the Case

The case arose from a defamatory campaign against Eva Glawischnig-Piesczek, an Austrian Green Party politician. The offensive material took the form of a picture linked to an article reporting false statements, accompanied by offensive comments. Ms Glawischnig-Piesczek requested that Facebook delete the untrue content and, after Facebook's refusal, brought proceedings before the Austrian courts, which issued an injunction requiring Facebook to disable access to the illegal content and to any re-posting of it aimed at damaging her reputation, in Austria and worldwide.

The Austrian Supreme Court then asked the CJEU to rule on the interpretation of art. 15 ECD: whether the directive allows a court to order Facebook to remove illegal content infringing personality rights, and whether such an injunction can apply

(a) worldwide;

(b) in the relevant Member State;

(c) to the relevant user worldwide;

(d) to the relevant user in the relevant Member State;

and whether the injunction also extends to information with an equivalent meaning as soon as the operator becomes aware of that information.

The CJEU ruling takes on board the Advocate General's opinion, which argued that software can be used to reconcile the aim of stopping the dissemination of illegal content with the prohibition on imposing a general monitoring obligation.

The Advocate General's opinion

AG Szpunar in his opinion starts by recognizing Facebook as a hosting provider, with the caveat "irrespective of the doubts that one might have in that regard" (§30). The AG remarked that the ECD does not prevent a judicial body from requiring an intermediary to terminate and prevent further infringements. In those cases, the intermediary does not benefit from immunity and can

"be ordered to seek and identify the information equivalent to that characterised as illegal only among the information disseminated by the user who disseminated that illegal information. A court adjudicating on the removal of such equivalent information must ensure that the effects of its injunction are clear, precise and foreseeable. In doing so, it must weigh up the fundamental rights involved and take account of the principle of proportionality."

In his opinion, AG Szpunar considers that imposing a general monitoring obligation on a host provider would alter the nature of the intermediary, since the provider would thereby become aware of the information stored, and an injunction must not bring this about. However, AG Szpunar also remarked that domestic courts are in a position to prevent an illegal activity. To resolve this tension, he distinguishes general monitoring from specific monitoring, that is, monitoring applicable to a specific case and limited in terms of subject matter and duration.

In addition, the AG suggested that automation preserves the neutrality of a platform: automation has the capacity to detect the reproduction of the same content by any user of a social network while, legally speaking, leaving the nature of the intermediary unchanged. In this passage, the AG builds upon Google France SARL (2010), which ruled that automation does not necessarily result in the platform having knowledge of, or control over, information.

The CJEU's ruling

The CJEU's decision opens by pointing out that Facebook was aware of the illegal content and consequently cannot benefit from art. 14 ECD, and that, in any case, an injunction can be granted even when a platform is not liable. The Court invokes recital 45 of the ECD and remarks that the exemption regime is without prejudice to the power of national courts to require a host provider to stop or prevent an infringement. The Court further recalls the Member States' obligation to ensure that appropriate court actions are available, the reason being that damage originating on the web may be rapid and extensive (§28).

Referring to art. 15(1), the CJEU confirmed AG Szpunar's interpretation: a Member State court is not precluded from ordering a host provider to stop and prevent an illegal activity that such a court has found to be illegal, since the prohibition of monitoring obligations does not apply to monitoring "in a specific case" (§34). The reason may well be, as the Court states, that there is a genuine risk that the information deemed to be illegal can easily be reproduced (§36).

In these circumstances, the CJEU continued, it is legitimate for a court to prevent the publication of "information with an equivalent meaning"; otherwise the injunction could simply be circumvented. The CJEU then defines information with an equivalent meaning as

"information conveying a message the content of which remains essentially unchanged and therefore diverges very little from the content which gave rise to the finding of illegality." (§39)

The CJEU then paves the way for the use of automated search tools, pointing out that an injunction aimed at protecting personality rights needs to be balanced and limited in scope. In addition, it cannot impose excessive obligations on an intermediary, nor require the intermediary to carry out an independent assessment of the equivalent content deemed illegal. In the view of the CJEU, automated technologies could then seek out this information, which requires no further analysis and falls within the specific characteristics of the injunction. In addressing the question of territorial scope, the Court states that the ECD does not preclude injunction measures from producing effects worldwide (§50).

Comment

The ruling immediately sparked the interest and comments of scholars worldwide, particularly regarding the consequences that the CJEU's interpretation may have for freedom of expression[1]. The main argument, with which we tend to agree, is that there is a risk of arbitrary removal of content, even when decisions are left to automation, as it is hard to gauge which and how much content has been taken down by automated systems, or on which criteria the removal was based. The criterion of "equivalent content" mentioned by the Court can be considered neither unambiguous nor clear enough to be left to the interpretation of a private company and an algorithm. Beyond these criticisms, it is also interesting to consider the wider consequences of this ruling in defining the role of law in the new digital scenario.
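The difficulty is easy to demonstrate. Any automated test for "equivalent meaning" must reduce the Court's formula ("diverges very little") to a measurable quantity and a cut-off, and both choices end up being made by the platform rather than by a court. A deliberately naive sketch of our own (the threshold value is arbitrary by construction, and no real system is implied):

    import difflib

    # Nothing in the ruling says where "diverges very little" ends,
    # so the platform must pick a number. 0.9 is a hypothetical choice.
    EQUIVALENCE_THRESHOLD = 0.9

    def is_equivalent(candidate: str, illegal_text: str) -> bool:
        """Naive 'equivalent meaning' test based on surface similarity.

        SequenceMatcher measures character overlap, not meaning: a clever
        paraphrase scores low, while a news report quoting the illegal
        statement in order to rebut it scores high.
        """
        ratio = difflib.SequenceMatcher(
            None, candidate.lower(), illegal_text.lower()
        ).ratio()
        return ratio >= EQUIVALENCE_THRESHOLD

Whatever real technique replaces this toy ratio, the threshold remains a policy choice embedded in software, which is precisely the interpretive power the ruling delegates to private companies.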

The CJEU's decision is based on the assumption that, according to the ECD, Facebook should be treated as a hosting provider[2]: given that, the main endeavour of the Court becomes to apply the legal framework of the ECD coherently. The CJEU thus interprets the provisions of the ECD so as to include social media, on the principle that social media should not have general monitoring obligations imposed on them and that the removal of illegal content should be based on a case-by-case assessment. By allowing automated removal systems for equivalent content, the ruling shields the neutrality of social media[3]: with this case, the CJEU acknowledges a second layer of neutral intermediation, namely automated search filters, and delegates to platforms the decision on the criteria for content removal.

With this ruling, the Court falls short not only in overlooking how social media may actively contribute to the spread of illegal content; more problematically, by relying on automated systems for the removal of content, the CJEU legitimates a new level of neutrality, thereby creating a further fictio iuris.

It is probably true that, for the time being, the ECD is the best piece of legislation at our disposal for coping with the challenges of liability on the web, and the CJEU ruling is a clever move to exploit the potential of existing law. However, the attempt to squeeze every case into this framework can, in the long term, be counterproductive, as it allows for a delegation of power to the platforms, a sort of "case-by-case self-regulation", with no accountability.

The territorial scope

The question of extraterritoriality also deserves attention. Contrary to the mainstream understanding, the CJEU did not order a "worldwide take-down"; it seems instead that the Court clarified that Member States are not prohibited from requiring worldwide implementation of the injunction. The ECD does not provide for "a limitation, including a territorial limitation, on the scope of the measures which Member States are entitled to adopt in accordance with that directive". Had it ruled otherwise, the CJEU would have created a norm where a provision does not exist, and it would have trespassed on the competences of Member States, which under EU law retain primary legislative power in criminal law matters.

The CJEU further clarified that Member States may order a host provider to remove information, or to block access to it, worldwide, provided this is compatible with the applicable international law. It does not, however, specify which provisions of international law are relevant.

*Marta Maroni would like to acknowledge the support received from "Reconfiguring Privacy – A Study of the Political Foundations of Privacy Regulation", funded by the University of Helsinki, and "Transparency in the EU – From Reaction to Manifesto?", funded by the Academy of Finland.

[1] Eliska Pirkova, No summer break for free expression in Europe: Facebook cases that matter for human rights, https://www.accessnow.org/no-summer-break-for-free-expression-in-europe-...

[2] But see §36, where the Court states that "Given that a social network facilitates the swift flow of information stored by the host provider between its different users, there is a genuine risk that information which was held to be illegal is subsequently reproduced and shared by another user of that network." This description recalls the "act of communication to the public" on which the definition of "online content-sharing service provider" is based in Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market, amending Directives 96/9/EC and 2001/29/EC.

[3] See Google France SARL and Google Inc. v Louis Vuitton Malletier SA (C-236/08).