In a recent Change.org petition, titled “Dear Facebook, thanks for the ‘Safety-Check,’ but on fighting ISIS, you can do much better!”, Julie Guilbault criticized Facebook’s handling of pro-ISIS accounts that claimed responsibility for the attacks and posted disturbing material from the Bataclan massacre in Paris.
Facebook did eventually remove these messages, but it’s the matter of timeliness — or a lack thereof — that Guilbault took issue with.
According to Guilbault:
When an art lover posted “The Origin of the World” by Gustave Courbet on Facebook, unsurprisingly, the ingenious porn-detection algorithm found it immediately.
But when it comes to advocating terrorism and publishing decapitation videos: no worries, they enjoy a comfortable delay before the content or their account is deleted.
This “moderation” policy is patently absurd!
The petition gathered more than 135,000 signatures in support of its message.
Today, Facebook’s Head of Global Product Policy, Monika Bickert, responded:
When content is reported to us, it is reviewed by a highly trained global team with expertise in dozens of languages. The team reviews reports around the clock, and prioritizes any terrorism-related reports for immediate review.
We remove anyone or any group who has a violent mission or who has engaged in acts of terrorism. We also remove any content that expresses support for these groups or their actions. And we don’t stop there. When we find terrorist-related material, we look for and remove associated violating content as well.
When a crisis happens anywhere in the world, we organize our employees and, if necessary, shift resources to ensure that we are able to respond quickly to any violating content on the site. For instance, in the wake of the recent attacks in Paris, we also reached out immediately to NGOs, media, and government officials, to get the latest information so that we were prepared to act quickly. Many of our employees, especially our French speakers, worked around the clock to respond to the spike in reports from our community.
Bickert was also quick to point out that Facebook tries to remain relatively hands-off when users share “upsetting content” to promote awareness of an issue.
Of course, the Paris attacks weren’t one of those cases, but the distinction is an important one for grasping just where Facebook stands on offensive imagery.
Many people in volatile regions are suffering unspeakable horrors that fall outside the reach of media cameras. Facebook provides these people a voice, and we want to protect that voice.
➤ Facebook responds to critics of its policies against terrorism [Mashable]