Publication

Germany: Draft "Network Enforcement Law" to tackle hate speech and fake news

05/04/2017

Locations

Germany


German Minister of Justice and Consumer Protection Heiko Maas (hereafter the "Justice Minister") has attracted the attention of many lawyers and legal experts with a new draft law on enforcement within social networks (Netzwerkdurchsetzungsgesetz, hereafter "NetzDG"). Much of that attention has taken the form of intense criticism. Despite this criticism, the Justice Minister presented a revised draft that tightens the proposed rules in some respects. Many stakeholders consider the handling of the revised draft unfair, because it was submitted to the European Commission for notification while the deadline for comments in the legislative process was still running.

The law's objective is to take action against "hate speech" and "fake news". Especially in light of the upcoming elections in Germany this year, the topic is highly sensitive. Political debate is a core element of, and constitutive for, a modern democracy. However, a legal framework that protects the rights of individuals seems important to guarantee a factual basis for such debate. The question is where to draw the line for such a framework; for the Justice Minister, that line seems to be clear.

One year ago, the Justice Minister set up a task force on hate speech. In mid-March 2017 he presented the network enforcement law, intended to force large social networks to delete certain content more quickly. The draft states that voluntary self-commitment by providers is insufficient to address an online debate culture that is "often aggressive, offending and not rarely full of hate". It is therefore said to be necessary to oblige social network providers by law to live up to their "responsibility for social debate-culture".

1. Scope of application

The NetzDG applies to social networks. The legal definition of a social network is set out in Section 1: a social network is a "tele-media service provider that – with the intention to earn a profit – operates a platform online which enables users to exchange discretionary content with other users, share it or bring it to the public". However, only social networks with more than two million users accessing from a German IP address will be regulated by the law. Platforms with journalistically edited offerings for which the provider takes responsibility are also not covered.

This definition is wide-ranging and vague. There is no guarantee that only conventional social networks like Twitter or Facebook will be affected by the law, mainly because of the element of the definition referring to "bring[ing] to public discretionary content". The official justification for the law states that only "social networks with no specific subject and user determination" are captured, and that professional networks are likewise outside the scope of application. However, these restrictions cannot be found in the text of the law itself (Section 1).

Given the wording used in the draft, it is highly likely that more platforms than originally intended will be covered by the law. The prerequisite of "discretionary content" can easily be fulfilled by platforms that intend to cover only specific subjects (e.g. music, gaming or business), because users are mostly free to choose the topics of discussion.

Furthermore, the threshold of "two million users" seems difficult to apply, as the number of registered users can fluctuate daily or weekly. A platform could thus fall within the scope of the law one week and leave it again the next. Finally, it will prove rather difficult to justify the privileged treatment of social networks that do not require registration.

2. Obligations for platform operators

The main obligation for social media platforms is to delete or block unlawful content. The law prescribes a procedure for this and sets out three different time limits: the provider must take notice of a complaint immediately and examine whether the content is unlawful; manifestly unlawful content must be deleted or blocked within 24 hours; other unlawful content must be deleted within 7 days.

Most providers have no background information about the content itself, which would be needed to determine whether it is unlawful. Deciding within such a short period carries the risk that the balancing of freedom of speech against the rights of others is done precipitously. A matter of such importance for fundamental rights cannot be accelerated without danger. The risk of misjudgement falls on the provider alone, and the resulting time pressure is very likely to have negative effects: providers may delete content regardless, simply to avoid high penalties (fines can range from EUR 500,000 to EUR 50,000,000).

The new cabinet draft, to be adopted on 5 April, contains some additional, notable changes compared with the original draft. The revised draft clarifies that a one-time violation will not normally justify a penalty, as it does not in itself indicate a generally insufficient procedure for handling user complaints. Platform operators can also notify the penalty authority of a cautious approach taken to protect freedom of speech if they believe that the content in question is not unlawful. If the platform operator cannot establish the truth of a statement within the deadline, no summary proceedings will be initiated.

Providers must store deleted content domestically for 10 weeks. From a data protection law perspective this could create legal uncertainty in various ways. The revision of the first draft reduced the retention period to 10 weeks; the first draft had provided for retention of the content for an unlimited period of time, which would not have complied with the standards set out by the German Federal Constitutional Court and the ECJ.

3. Change of the German Telemedia Act (Telemediengesetz)

During the revision of the first draft, a second article was added that amends the above-mentioned Act. The change concerns a data protection permission allowing providers to disclose users' subscriber data (Bestandsdaten) to relevant authorities. The article expands the categories of rights whose infringement can give rise to a claim for information against a provider: originally, information could only be obtained where intellectual property had been infringed. The new draft adds infringements of "other absolutely protected rights", meaning infringements of personality rights, as a basis for a right to disclosure.

As of today, however, no such right of disclosure is explicitly regulated by German law, and the German Telemedia Act governs not only social network providers but also other tele-media services.

4. Privatization of Prosecution

Many critics fear that the state wants to use providers as deputies for law enforcement (some even speak of an "opinion police"). This concern is based on the possible consequences for freedom of speech, as it is normally the responsibility of the courts alone to determine whether a statement is true and lawful. That determination is highly complex, especially for critical statements about politics, and a platform has neither the resources nor the investigative powers to make it.
