Eighty-eight civil liberties organizations have penned an open letter to Facebook requesting that the company allow users to appeal whenever their posts are removed — an option they currently do not have.
The groups — which include the Electronic Frontier Foundation (EFF), the American Civil Liberties Union (ACLU), and the Digital Rights Foundation — address CEO Mark Zuckerberg directly, asking that he add an option for the site’s users to defend themselves against potentially unwarranted censorship.
Facebook revealed more about its internal enforcement rules earlier this year, breaking them down into different categories with specific instructions about which posts violate the rules and which don’t. At the same time, the company expanded its appeals process in order to allow users to contest the removal of posts for nudity, sexual content, or violence.
As the groups say in their letter:
This is a positive development, but it doesn’t go far enough. Today, we the undersigned civil society organizations, call on Facebook to provide a mechanism for all of its users to appeal content restrictions, and, in every case, to have the appealed decision re-reviewed by a human moderator.
Beyond expanding the appeals process, the groups requested greater transparency in every decision: users should be told exactly which rule a post violated and how the violation was detected, should be allowed to contest the decision with evidence, and should have the appeal judged by a fresh adjudicator.
As the groups pointed out in the letter, Facebook has a rather slipshod history of applying its own rules, and sometimes removes inoffensive content outright. The first example that springs to mind is when it removed a photo of the Venus of Willendorf figurine despite such art being explicitly allowed under its rules. Another time, it censored the iconic “Napalm Girl” photograph, despite its established historical significance. My favorite blunder, though, has to be the time it removed a post containing a passage from the Declaration of Independence.
In all of the above cases, the company later conceded the content had been removed in error.
Finally, the groups request Facebook reveal more data about content takedowns, including how much content was censored, which guidelines it allegedly violated, and how many posts were removed in error.
The suggestions are essentially an application of the Santa Clara Principles, a set of guidelines developed by several of this letter’s cosigners to help tech companies improve their moderation policies. As the EFF puts it, “The plain language, detailed guidelines call for disclosing not just how and why platforms are removing content, but how much speech is being censored.”