On 22 June 2021, the CJEU ruled in the joined cases of LF v Google LLC, YouTube & Ors (C-682/18) and Elsevier v Cyando (C-683/18) on the liability of online platform operators in relation to:
The scope of communication to the public under Article 3(1) of the InfoSoc Directive (2001/29)
The applicability of Article 14(1) of the E-Commerce Directive (2000/31), and
The circumstances under which an injunction may be granted to a rightholder under Article 8(3) of the InfoSoc Directive
(See here for our previous blog on the Advocate General's opinion in July 2020 – Online platform operators should not be directly liable for illegal uploads by users (currently)).
In a nutshell, the CJEU ruled that operators of online platforms do not, in principle, themselves make a communication to the public of copyright-protected content illegally posted online by users of those platforms. However, those operators do make such a communication, and therefore infringe copyright, where they contribute, beyond merely making those platforms available, to giving access to such content to the public.
The CJEU also confirmed that such operators may benefit from the exemption from liability under the Article 14 safe harbour hosting defence unless they play an active role so that they have knowledge of or control over the content on their platform.
In the YouTube case, music producer Frank Peterson is seeking damages from YouTube and Google in respect of videos of singer Sarah Brightman that were made available on YouTube without his authorisation. The second case concerns proceedings brought by the publishing group Elsevier regarding medical material unlawfully uploaded to Cyando's file-hosting platform, Uploaded.
Summary of the legal issues
In summary, the questions referred were:
Whether the operator of a file sharing platform carries out a communication to the public under Article 3(1) of the InfoSoc Directive and could be considered primarily liable for hosting content uploaded by users on its platform, particularly when it earns advertising revenue but the files are uploaded automatically without any review or control in advance; and
If not liable under the InfoSoc Directive, whether the activities fall within the safe harbour exemption under Article 14 of the E-Commerce Directive?
Article 3(1) – 'communication to the public'
Taking into account criteria established in previous case law, the CJEU assessed the objectives and the concept of a 'communication to the public'. Taking a less trenchant view than the Advocate General, the CJEU attempted to strike a balance between the interests of copyright holders and the importance of the internet for fundamental rights, such as freedom of expression and information. The CJEU made it clear that the concept of communication to the public included two cumulative criteria (i.e. an act of communication of a work and the communication of that work to a public) which required an individual assessment, based on several complementary criteria.
As part of assessing whether the platform operator carries out an 'act of communication', the CJEU emphasised the importance of the indispensable role played by a platform operator and the deliberate nature of its intervention. The CJEU held that merely making the platform available for users to autonomously make protected content available to the public would not constitute a 'communication to the public' within the meaning of the InfoSoc Directive. Instead, an 'act of communication' would occur where the operator intervenes, in full knowledge of the consequences of its actions, to give its customers access to a protected work that would not otherwise have been available to them, thereby contributing to making illegal content available, for instance (see paragraph 84, a key part of the ruling):
Where that operator has specific knowledge that protected content is available on its platform and refrains from deleting or blocking access expeditiously;
Where, despite knowing in a general sense that content is being made available illegally on its platform, it fails to put in place appropriate technological measures to counter copyright infringement on that platform; or
Where the operator selects the illegal content or provides tools specifically intended to aid the sharing of such content, as attested by the fact that the operator has adopted a financial model that encourages the illegal sharing of content on its platform.
An important point also clarified by the CJEU was that the mere fact that a platform operator is profit-making is not enough to establish its deliberate intervention in the illegal communication of protected content. The fact that it provides services for profit does not mean that it consents to those services being used by users to infringe copyright.
It is also not sufficient if the platform operator merely knows, in a general sense, that the platform is being used to make protected content available illegally.
Although the CJEU held that it was for the referring court to decide on liability based on the particular facts of the case, it did indicate how the criteria might apply to these platforms. For example, the court highlighted YouTube's lack of intervention in uploads by users, its terms and conditions prohibiting copyright infringement, the technical measures it has in place to prevent and end copyright infringement, and took the view that its financial model was not based on the presence of illegal content. This suggests that, although the CJEU was not deciding liability, it did not consider YouTube to be carrying out a relevant communication.
Article 14(1) – safe harbour exemption
The CJEU then considered the second question referred relating to the scope of the safe harbour exemption under Article 14(1) of the E-Commerce Directive.
The CJEU concluded that an operator could only rely on this exemption from liability if it did not play an active role of a kind giving it knowledge of, or control over, the content uploaded to its platform, going beyond a merely technical, automatic and passive role. For an operator to be excluded from the exemption (i.e. to remain open to liability), it would need to have knowledge or awareness of specific illegal acts carried out by its users in relation to unlawful content uploaded to the platform. It was not enough for the operator to be aware, in a general sense, that its platform was being used to share content which may infringe intellectual property rights; such 'abstract knowledge' that protected content was being made available illegally on its platform did not suffice.
Further, the fact that an operator automatically indexed content uploaded to the platform, that the platform had a search function and that it recommended videos on the basis of users' profiles or preferences was not a sufficient ground to conclude that the operator had 'specific' knowledge of illegal activities carried out on that platform or of illegal information stored on it.
The CJEU confirmed that if the referring German court was to find that YouTube or Cyando were communicating to the public (i.e. because they contributed, beyond merely providing their platforms, to giving the public access to protected content in breach of copyright) they would not be able to rely on the exemption from liability under Article 14.
The CJEU made it clear that any notification provided to an operator in relation to infringing content should contain sufficient information to enable the operator to determine, without a detailed legal examination, whether the content is infringing and whether its removal would be compatible with the right to freedom of expression.
Article 8(3) – injunctions against intermediaries
The CJEU stated that, in relation to the circumstances in which rightholders can obtain injunctions against intermediaries, Article 8(3) of the InfoSoc Directive allows national law to provide that a rightholder cannot obtain an injunction against an intermediary which has no knowledge or awareness of any infringement unless, before legal proceedings were commenced, the intermediary had been notified of the infringement but failed to act expeditiously to remove or block the content.
Such a condition (which exists under German law), although not required under EU law, was considered compatible with the objectives of the InfoSoc Directive in that it allows for the correct balance between, on the one hand, the protection of IP rights, and on the other hand, the freedom to conduct a business enjoyed by service providers and users' right to freedom of expression and information.
The CJEU stated that 'while such a condition allows illegal information to be removed or blocked, it is also intended to require the rightholder, first, to give the service provider the opportunity expeditiously to bring the infringement concerned to an end and to prevent its recurrence, without that service provider, who is not liable for that infringement in accordance with Article 14(1) of the Directive on Electronic Commerce, being exposed unduly to court costs and without the rightholder being deprived, second, of the option of applying for an injunction to be issued against that same service provider, where that provider does not fulfil its obligations'.
Interplay with Article 17 of the new DSM Copyright Directive (2019/790)
It is important to be aware that proceedings in this case were issued under the InfoSoc Directive and not under the new Copyright Directive (2019/790) which has a different and stricter liability regime for online platforms. The CJEU did refer to the new Copyright Directive, but only to confirm that the questions referred did not concern the set of rules established by Article 17 of that Directive, which came into force subsequently.
Article 17 sets out a new liability regime for certain 'online content-sharing service providers' ("OCSSPs", as defined in the Directive), aimed at online platforms such as YouTube. Under the new regime, platform operators are directly liable for copyright-protected content uploaded by users unless they have obtained, or can demonstrate that they have made best efforts to obtain, authorisation from rightholders. If a service provider is found to be liable under Article 17, it cannot rely on the hosting safe harbour provision under Article 14 of the E-Commerce Directive, which no longer applies in these circumstances.
Given the contrasting nature of the two regimes, had this case been determined under the new regime, the outcome may have been different, although in both regimes, the platform operators need to make a somewhat proactive effort to control content on their platforms.
Although there is a new high-profile regime for online platforms under Article 17 of the new Copyright Directive, it only applies to platforms that fall within the specific definition of an OCSSP. Therefore, for platforms outside Article 17, the interpretation in this case will still be applicable. Also, Article 17 is currently facing a legal challenge by the Polish Government (C-401/19) (see more on this and on Article 17 in general in our blog, European Commission's guidance on Article 17 of the new Copyright Directive…) and its validity is therefore still in question.
Of course, we must also remember that this ruling does not apply in the UK now that it is outside the EU, so it remains important to understand the scope of communication to the public under Article 3 of the InfoSoc Directive as interpreted in retained CJEU judgments. The UK is not bound by this particular ruling, given that it was delivered post-Brexit, but UK courts may still have regard to it when interpreting the law, especially given the body of retained case law on which this decision builds.
The CJEU made clear that it is for the referring court to confirm how this decision on the scope of the relevant laws applies to YouTube and Cyando. However, as set out above, the suggestion is that on these facts, the CJEU did not seem to consider it likely that the platforms would be liable, which is good news for platform operators. The case also provides some helpful, practical guidance on steps that platform operators can take in order to ensure they are not carrying out 'a communication' and are able to fall within the safe harbour defence. There is no prescriptive checklist as such, but a useful yardstick nonetheless.
As the law continues to develop in this area, and with discussions surrounding the proposed Digital Services Act (which sets out yet another liability regime for service providers) ongoing, it will be interesting to see the impact these developments have on cases like this in the future.
With special thanks to Monique Caprice, trainee Legal Executive, for her contribution to this article.