A man walks past a German ad for Chinese video-sharing social networking service TikTok in Berlin on September 23, 2020. Germany is revisiting a law that regulates social media content. (AFP/John MacDougall)

Germany revisits influential internet law as amendment raises privacy implications

On October 1, a new law to regulate content posted on social media platforms took effect in Turkey, The Guardian reported. Turkish journalists already face censorship and arrest because of social media posts, CPJ has found, and the law offers just one more tool to censor news. 

Yet the legislation was not solely conceived in Ankara; it follows the example of one of the world’s leading democracies – Germany. Danish think tank Justitia has charted lawmakers citing Germany’s 2017 Network Enforcement Act as an example or copying its clauses into domestic legislation in Russia, Singapore, and Venezuela, among other countries.

Laws that seek to regulate social media content – especially those involving poorly defined concepts like “fake news” and “hate speech” – have implications for journalists who report on such subjects; they can also be used as a pretext to censor journalistic activity. Such laws have raised other concerns in some countries: France’s Constitutional Council rejected most of a draft law for online hate speech in June this year, according to Reuters.

Germany’s own law, known colloquially as NetzDG, has also been undergoing review by both politicians and experts. In an unusual turn of events, President Frank-Walter Steinmeier has held off ratifying an amendment to the law, apparently concerned about its constitutionality, according to the local digital-focused news outlet netzpolitik. Among other changes, the amendment that parliament approved in June would require social media companies to provide some details about accounts posting allegedly criminal content to a database managed by federal police for investigation, according to netzpolitik and TechCrunch.

To learn more about the amendment and its current status, CPJ spoke via video with Matthias Kettemann of the Leibniz Institute for Media Research in Hamburg, who focuses on the interaction of private and public rules for online speech. The interview has been edited for length and clarity. 

Why is the Network Enforcement Act relevant for journalists and others who care about freedom of expression online?

The Network Enforcement Act kickstarted a discourse on platform responsibility. It was one of the first attempts to introduce transparency requirements, to force [platforms] to delete content the state deemed illegal. It’s been very influential. States in Africa and in Asia have their own “NetzDGs.” Now, Brazil, Turkey, Austria, and France are working on their own versions. 

Since it came from Germany, a lot of states use it as a fig leaf for introducing less than perfect approaches to online speech – they use the template, but without the legal protections in NetzDG. So they have the law without the rule of law – a bad thing. 

Tell us about the status of the recent amendment.

Within NetzDG there is a clause saying it has to be reviewed, but even before that review was published last week, the government decided to tighten up the rules. They added new transparency requirements, which is a good thing, and there’s a new clause allowing people to complain when their content was mistakenly taken down. But they also included a governmental database. Platforms have to communicate content and certain identifying information [about their users] if they think that a certain post violates the law. 

This is where it gets interesting. Once both chambers of parliament have accepted a law, the federal president has to sign it. The president can only decide not to in case of evident unconstitutionality. The president’s office has been in contact with the parliament and indicated that [the privacy implications of] this governmental database might be so unconstitutional as to stop the president from signing. 

This usually doesn’t happen at all, maybe once or twice a decade. It must have rubbed the president the wrong way. His policy efforts have focused recently on digitization [of certain government operations] and he seems to be sensitive to this because of the impact of the NetzDG’s reforms on digital rights. 

Could journalists be affected by the amendment, if it passes?

The content affected, such as hate speech and Holocaust denial, is already defined in the law. What has changed is the obligation to communicate personal data. It’s not targeted at journalists, but they could conceivably get into trouble if their reporting on hate speech is flagged. That’s happened from time to time, though platforms generally agree that journalism – using an image of a neo-Nazi flag in the context of an article on neo-Nazi activities for example – isn’t supposed to be taken down.  

But in light of a higher automated takedown rate due to more human moderators being at home during the COVID-19 pandemic, one would hope that all reports to such a database would be human checked. If not, it would be a huge oversight. Facebook’s recent transparency report shows a huge spike in takedowns, 100% more than last year, most likely due to a lack of intervention by human moderators. 

[Editor’s note: In August Facebook said, “Due to the COVID-19 pandemic, we sent our content reviewers home in March to protect their health and safety and relied more heavily on our technology to help us review content.”]

Platforms would have to ensure that every single report to a governmental database, if it comes, is made by someone who knows German law. If not, there is a danger of over-reporting and blocking first, and then journalists might get into trouble based on their presence in the database.

Could the requirement for a database proliferate in other countries, if it were signed into law?

Absolutely. That is the problem. One of the reasons I and other internet law experts tried to caution Germany about the law in the first place is that the country is often seen, at least by some states, as an example [in terms of its] laws. There’s no reason to think that this wouldn’t happen with the database. 

It’s especially bad because historically, data protection is something that has strong roots in both German and European Union law. This database goes in the wrong direction.

What could happen next with the amendment?

It’s difficult to say what will happen, because we don’t have a playbook for this. Parliament could be told to try again. It’s also not impossible that the amendment dies. That would be partially a shame because some of it is important, for instance, the right to the reinstatement of content. Right now, people have to go to court to get content reinstated, which is a very arduous process. But it wouldn’t be at all bad if the database withers away.

How often do you come across examples of wrongful deletion under NetzDG?

It’s so difficult to track. The only examples we have are cases where people publicize that their content was taken down after it happens. We count on media reports too, and there is the Lumen database [a Harvard University project that compiles government requests for content removal or data received by certain social media companies]. But there’s no comprehensive study. 

It’s good to have the transparency reports that the law requires, but they’re the minimum you could expect. They tell you how many pieces of content have been flagged and taken down under NetzDG. But what do those numbers tell you – out of how many total pieces of content? How many others were never flagged?  

It’s a valiant effort and the companies have committed resources to reporting. But they’re unable to provide us with data which scientists can build on. Did the Network Enforcement Act have a positive effect? We talk about it more, and platforms now have dedicated German teams, which is good. But is the internet better now than it was? No one can answer that in a way that’s backed by data.   

Many scholars at the time said the law was unconstitutional. I would agree, but four years later there’s no evidence that it has been disastrous for freedom of speech. The law was not a big attack on journalists. Those dealing in humor or satire might be slightly more impacted because that doesn’t translate well for algorithms. There was an incident like that with Titanic magazine [when the publication’s tweet parodying a politician’s anti-Muslim comments was briefly removed in 2018]. But that’s one case. It’s fun to talk about, but it’s hard to find examples of non-accidental takedowns. 

Compare that to the new Turkish social media law, where the telecommunications authority can issue takedown orders. If you couple that with obligations on companies to take down content, then you have very few legal protections. 

But German lawmakers, even those who are sensitive to human rights, say they can’t really be responsible for what other states are doing. 

What would improve these laws? 

I’m involved in the Brazilian law-making process a bit. I usually suggest that when drafting new content-related laws, legislators should ensure that the laws are not overbroad. They shouldn’t try to solve everything. The draft Brazilian law includes everything from “fake news” to data retention to a governmental database. 

It’s really important to ensure protection through courts, and limit takedown obligations to specific types of illegal content, serious crimes. It helps if the law is enforced by an authority that can interact with platforms and be responsive prior to issuing penalties. Journalists can also contribute to the process by showing which problems exist and where regulation works and doesn’t work. 

Journalists should pay attention to the wave of regulation that’s everywhere now. After 10 years of not using regulatory tools to rein in platforms as they grow and amass data, states are overcorrecting. But what they should actually do is work on the dynamics reigning in society — the trend away from social cohesion and shared knowledge, the partisan conception that you have to “believe” news, the mistrust in authorities. The Network Enforcement Act won’t have much of an impact on that. It’s something that we as society have to confront.