An investigation by The Times has landed Facebook in hot water after the social media giant failed to remove flagged and potentially illegal content.
A Times reporter set up a fake profile and flagged a number of posts containing (to put it mildly) unpleasant content. Some depicted the sexualization or abuse of children, and some glorified recent terror attacks or showed beheadings.
Some of the flagged posts were removed, but others — including one child abuse video — were not. According to moderators, they didn’t violate community standards. Some were even promoted to the reporter by Facebook’s algorithms.
It was only after The Times contacted Facebook officially that the content was all taken down. Given that it took Facebook two weeks to remove a video depicting a 12-year-old’s suicide, that’s probably not a surprise.
Needless to say, those kinds of images are illegal in the UK (and pretty much everywhere else in the West). When asked what this could mean for Facebook from a legal perspective, Yvette Cooper, chair of the UK home affairs select committee, said, “Social media companies need to get their act together fast, this has been going on for too long.” She also referenced a proposed German law which would impose penalties on companies like Facebook for not removing illegal content quickly.
Facebook insists all the content the reporter flagged has now been removed. Still, it shouldn’t take fear of public shaming (or prosecution) to remove stuff like that. Justin Osofsky, Facebook’s veep of global operations, apparently agrees:
We are sorry that this occurred. It is clear that we can do better, and we’ll continue to work hard to live up to the high standards people rightly expect of Facebook.
Facebook also faces trouble in that its users sometimes don’t flag unlawful content like this at all. That was apparently the case with the livestreamed rape of a 15-year-old. But when stuff like this is flagged, it really shouldn’t be like pulling teeth to see it gone.