This article was published on August 20, 2018

EU to fine social media platforms that take more than 1 hour to remove extremist content

The European Union is reportedly planning to impose stricter regulations on social media platforms including Facebook, YouTube, and Twitter over the removal of online terrorist propaganda.

The Financial Times reported on Sunday that, under draft regulations due to be published next month, the EU plans to fine companies that fail to delete terrorist content within an hour of it being posted, abandoning its earlier approach of asking internet platforms to remove such content voluntarily.

The upcoming legislation builds on guidelines the EU first issued in March, which called for radicalizing online content to be removed within one hour. At the time, the European Commission in Brussels promised to review the progress companies made and follow up with new legislation.

Julian King, the EU’s commissioner for security, told the FT that the regulations would help create legal certainty for platforms and would apply to websites of all sizes.


Last year, the UK policy think tank Policy Exchange published a report profiling the ways in which terror outfits used social media platforms like Telegram, Twitter, Facebook, and YouTube to disseminate terrorist content online. The report, titled “The New Netwar: Countering Extremism Online,” said:

While Telegram exists as a ‘safe haven’ for jihadists, they have not abandoned other platforms such as Twitter, Facebook and YouTube. Twitter accounts for 40% of the identifiable traffic to jihadist content online.

For their part, companies have tried using artificial intelligence to automatically identify radicalizing content online, and have collaborated on a shared database of images and videos that promote terrorism, so that known content can be traced and removed more quickly. Platforms could also crack down by enforcing stricter terms of use, but they risk driving users away if such policies are seen as restricting freedom of speech.
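For a sense of how such a shared database works in principle, here is a minimal, purely illustrative Python sketch: one platform contributes the fingerprint of content it has removed, and other platforms check new uploads against those fingerprints. The function names and the use of a plain SHA-256 digest are assumptions made for brevity; the systems the companies actually built rely on perceptual hashing that tolerates re-encoding and minor edits.

    import hashlib

    # Hypothetical shared database of fingerprints for known extremist media,
    # contributed by participating platforms.
    SHARED_HASH_DB = set()

    def fingerprint(media_bytes: bytes) -> str:
        # Compute a fingerprint for an uploaded image or video file.
        return hashlib.sha256(media_bytes).hexdigest()

    def report_content(media_bytes: bytes) -> None:
        # A platform that removes a piece of content adds its fingerprint
        # so other platforms can detect re-uploads immediately.
        SHARED_HASH_DB.add(fingerprint(media_bytes))

    def should_block(media_bytes: bytes) -> bool:
        # Flag an upload whose fingerprint matches known extremist content.
        return fingerprint(media_bytes) in SHARED_HASH_DB

    # Usage: one platform reports a removed video; another catches the re-upload.
    report_content(b"<bytes of removed propaganda video>")
    print(should_block(b"<bytes of removed propaganda video>"))  # True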

Despite these efforts, a study released last month by the Counter Extremism Project found that between March and June, ISIS members and supporters uploaded 1,348 videos to YouTube, garnering 163,391 views, and that more than 24 percent of those videos stayed on YouTube for over two hours. That was long enough for the videos to be downloaded, copied, and redistributed across Facebook, Twitter, and other social media platforms, even after YouTube found and deleted the originals.

King added that Brussels had “not seen enough progress” from tech companies on the removal of terrorist material and would “take stronger action in order to better protect our citizens”. However, the plan to toughen regulations is reportedly contested by parts of the Commission, which believe that self-regulation has already proved effective on the platforms terrorist groups use most.

The proposed regulations, if approved by the European Parliament and a majority of EU member states, would be the first EU rules to directly target tech companies over how they handle illegal content on their websites.
