
This article was published on April 8, 2019

EU proposal requires platforms to delete ‘terrorist content’ in 1 hour or face fines

The EU's proposed terrorist regulation risks unwanted censorship and endangers smaller platforms.

Story by Már Másson Maack

Editor, Growth Quarters by TNW

Már tries to juggle his editorial duties with writing the occasional weird article. He also loves talking about himself in the third person.

Update April 8, 5pm CET: The LIBE committee voted in favor of including the one-hour removals. Read on to find out how that might affect the internet.

Just 12 days after voting ‘yes’ on the controversial Copyright Reform, the EU is discussing new regulation that might further hurt smaller platforms. The EU Parliament’s Committee on Civil Liberties, Justice, and Home Affairs (LIBE) will vote today, April 8, on implementing a one-hour deadline for any platform to delete terrorist content.

The current proposal gives platforms a 12-hour deadline the first time terrorist-related content is flagged, but only one hour for subsequent removal requests. Just like with GDPR, a platform could be fined up to four percent of its global turnover for not complying — an incredibly serious sanction.

But why is this a big deal? Terrorist content is already illegal, so it should be fine to have platforms take it down within one hour, right? Well, MEPs Julia Reda (Pirate Party, Greens-EFA) and Sophie in ‘t Veld (D66, ALDE) disagree. A rushed removal process will risk unwanted censorship, make it harder to run smaller platforms, and hurt the diversity of the internet.

“Not only is it practically unfeasible for small businesses, it’s also not enough time to assess the removal order and its impact on fundamental rights,” in ‘t Veld points out.

What’s important to keep in mind is that the one-hour rule wouldn’t just apply to the likes of YouTube or Facebook, who can afford to hire a team to rapidly respond to removal orders 24/7. Reda explains the obligation applies to any website where users can upload material autonomously, even if it’s just a comment section of a blog or a community project like Wikipedia.

Reda says that even though the EU Commission (which is in favor of the one-hour rule) estimates that only one to two percent of providers in Europe will be exposed to terrorist content, the legislation will place significant organizational and financial burdens on almost all platforms.

“Small website owners either have to ensure constant reachability (no more switching your phone to silent at night or during the weekend), or use automated tools to automatically block any content reported to them by authorities, without any human oversight,” Reda explains. “These automated tools that are likely to also affect legal content are precisely what we want to avoid.”

Sophie in ‘t Veld (left) and Julia Reda (right)

Reda also worries about the diversity of platforms online if this one-hour deadline is implemented in the final legislation:

“One likely outcome is that more independent, hobbyist, or small business websites will shut down and move their activities to third-party platforms like Facebook, who can take care of compliance with the terrorism regulation. EU policy-makers should put in place measures to increase the diversity of the internet ecosystem, instead of propping up already dominant internet giants.”

The EU isn’t alone in losing faith in online companies’ self-regulation when it comes to terrorist content. The UK is looking to introduce legislation that allows social networks to be blocked or fined if they fail to stop the spread of harmful content, such as misinformation, terrorist propaganda, and content depicting child sexual abuse. Australia passed a similar bill that’ll punish platforms and their executives if they fail to rapidly remove “abhorrent violent material,” and Germany has implemented a 24-hour window to remove “obviously illegal” content.

None of these measures comes close to the incredibly tight one-hour removal window being proposed in the LIBE committee today. Reda's colleagues will likely vote against including it, while in 't Veld will table an amendment that removes the time limit and instead demands removal "without undue delay." This would hopefully mitigate the negative effects the legislation might have on smaller platforms.

A systemic issue

Sophie in ‘t Veld says what’s particularly dangerous about this legislation is that it’s being rushed through, not giving MEPs and committee members enough time to do their job effectively.

“Legislation proposals in the area of counterterrorism and security are systematically not accompanied by an impact assessment — even if it’s required,” in ‘t Veld says. “And all the advice given, for example, by the European data protection supervisor or fundamental rights agencies — in addition to rulings by the European Court of Justice — are systematically ignored.”

In 't Veld is disappointed in the EU Commission and member states for pushing this through in this manner, especially with the upcoming EU Parliament elections. Under these circumstances, the Parliament can't function as the failsafe it's supposed to be in creating European law. "I think it's very unfortunate that we have to process it this way. I don't think that this is a serious way of legislating," says in 't Veld.

There appears to be a pattern lately within the EU of trying to react to problems online with patchwork regulation. As an example of how this can have unintended consequences, the Copyright Reform was implemented in reaction to the evolving online environment for publishers, but it threatened open source communities as it approached the issue from the wrong end.

This terrorist regulation is largely in response to the Christchurch attack earlier this year and the video the attacker streamed, according to in ‘t Veld. Everybody agrees on the necessity of curbing the spread of terrorist content in the future, but the struggle is how to do it. For in ‘t Veld and Reda, it’s clear it has to be done responsibly and in line with our technological reality.

If the one-hour rule is approved today, it'll become the Parliament's position on the topic, making it nearly impossible to prevent it from becoming law. Julia Reda says the European Commission initially proposed the rule, and the Council continues to support it in the negotiation position it adopted last December. The last phase of making it official law is inter-institutional negotiations, and if all three parts of the EU agree on it — Commission, Council, and Parliament — then it's likely to go through.

“If the LIBE committee adopts the one-hour rule today, our best chance of removing it will be the plenary vote on the LIBE report, which is likely to happen in mid-April,” Reda explains.

We’ll update this article once the vote in the LIBE committee is done.
