Delfi v Estonia: Increased risk of liability for online news portals?

Delfi AS v. Estonia [2015] ECHR 586

Overview


On 16 June 2015, the Grand Chamber of the European Court of Human Rights approved the decision of a Member State court, which held that the provider of an online news portal was liable for offensive comments posted by third-party readers below one of its published articles. The Grand Chamber held that whilst the decision of the national court interfered with the applicant's right to freedom of expression under Article 10 of the European Convention on Human Rights, such interference was ultimately justified.


The applicant was the owner of one of the largest news portals in Estonia, which published up to 330 news articles daily and attracted about 10,000 reader comments. The article in question concerned public transport ferry services provided by SLK, a public limited liability company of which L was the sole or majority shareholder at the time. Within the first couple of days of publication, 185 comments were made, and around 20 of these contained highly offensive and/or threatening content directed at L. The website implemented an automatic preventative filtering system, and any reader could report an inappropriate comment, triggering its expeditious removal. However, despite these mechanisms being in place, the comments containing evident expressions of hatred and blatant threats were removed from the site only after a request from L's lawyers, six weeks after the comments had been posted. The Estonian court awarded L EUR 320 in compensation for non-pecuniary damage. Delfi applied to the ECtHR on the basis that its Article 10 rights had been violated. The first-instance Chamber judgment found no violation of Article 10. Delfi then appealed to the Grand Chamber.


The Grand Chamber agreed with the domestic court and the first-instance Chamber judgment, making three key points:

First, the appellant's role in monitoring comments was not merely technical, automatic and passive: it was a large, professionally managed portal run on a commercial basis with a substantial degree of control over the comments posted on the website. The appellant held an economic interest in proactively inviting readers to comment publicly on articles, as the number of visits to the website depended on the number of comments, and the revenue earned from the advertisements published relied on the number of visits. In addition, the readers' comments were integrated into the news portal. Consequently, the appellant was held to be a publisher and could not rely on the limited liability defence for internet service providers under the E-Commerce Directive on the basis that it was a passive, merely technical service provider.

Secondly, the prior automatic filtering and notice-and-take-down system used by the website was insufficient to protect the rights of third parties. Although the comments were taken down the same day that L's lawyers sent the take-down request, this was over a month after the comments were first posted. The Grand Chamber recalled that the comments were manifestly unlawful. It was found to be among the duties and responsibilities of the internet service operator to protect the rights of others by expeditiously removing unlawful speech, whether or not the commentators were identified. The website had substantial control over readers' comments and had chosen to upload non-registered users' comments automatically, without prior moderation.

Thirdly, the Grand Chamber dismissed the appellant's argument that responsibility should arise only from the point at which the company is made aware of the offensive comments, an argument founded on the claims that anything more would restrict the company's freedom to impart information and that the increased responsibility was not commercially viable. The Grand Chamber disagreed, stating that placing the burden of reporting the injurious comments on the victim was unjust, as the victim's abilities were limited, whereas a large commercial internet news portal had sufficient resources to monitor users' content. Ultimately, a balance had to be struck between protecting the right to exercise freedom of expression on the internet and restraining defamatory or other types of unlawful speech impinging on an individual's personal rights. For this reason, although the domestic court's decision interfered with the appellant's Article 10 right to freedom of expression, in this case the interference was considered appropriate to protect the rights of others.

The Grand Chamber specified that this judgment did not impose a form of "private censorship", nor was it applicable to "other fora on the internet" where third-party comments can be disseminated. Examples given included discussion forums or bulletin boards where users can freely set out their ideas on any topic without the discussion being channelled or moderated by any input from the operator; a social media platform where the platform provider does not offer any content; and websites or blogs run by private persons as a hobby.


This decision may have a significant impact on website operators and publishers. Websites which exercise a substantial degree of control over user-generated content must now be more vigilant in identifying and removing, on their own initiative, content which is "manifestly unlawful" (e.g. hate speech). The Grand Chamber's judgment makes clear that operators must remove such content proactively rather than waiting for take-down requests to initiate removal.
