This article was published on April 13, 2017

Facebook investigated for turning a blind eye to terrorism and child abuse


Image by: Lina R/Shutterstock

An investigation by The Times has landed Facebook in hot water after the social media giant failed to remove flagged and potentially illegal content.

A Times reporter set up a fake profile and flagged a number of posts containing (to put it mildly) unpleasant content. Some depicted the sexualization or abuse of children, and some glorified recent terror attacks or showed beheadings.

Some of the flagged posts were removed, but others — including one child abuse video — were not. According to moderators, they didn’t violate community standards. Some were even promoted to the reporter by Facebook’s algorithms.

It was only after The Times officially contacted Facebook that all of it was taken down. Given that it took two weeks for Facebook to remove a video depicting a 12-year-old’s suicide, that’s probably not a surprise.

Needless to say, those kinds of images are illegal in the UK (and pretty much everywhere else in the West). When asked what this could mean for Facebook from a legal perspective, Yvette Cooper, chair of the UK home affairs select committee, said, “Social media companies need to get their act together fast, this has been going on for too long.” She also referenced a proposed German law which would impose penalties on companies like Facebook for not removing illegal content quickly.

Facebook insists all the content the reporter flagged has now been removed. Still, it shouldn’t take fear of public shaming (or prosecution) to remove stuff like that. Justin Osofsky, Facebook’s veep of global operations, apparently agrees:

We are sorry that this occurred. It is clear that we can do better, and we’ll continue to work hard to live up to the high standards people rightly expect of Facebook.

Facebook faces another problem, too: its users sometimes don’t flag (unl)awful content like this in the first place. That was apparently the case with the livestreamed rape of a 15-year-old. But when stuff like this is flagged, it really shouldn’t take this much teeth-pulling to see it gone.
