Facebook today revealed the internal guidelines by which its moderators review content, meaning users will finally have a clearer idea of why some content is taken down, and how they can appeal the decision.
What it is: The complete list of community standards is divided into six sections, each tackling a specific kind of content. For example, there are separate sections for objectionable content, such as hate speech, and for respecting intellectual property. Each section is now augmented with very specific rules about what is and is not a violation of the standards.
According to Monika Bickert, Facebook’s VP of Global Policy Management, the rules haven’t changed. The company is just revealing how it handles violations:
…for the first time, we are publishing the internal implementation guidelines that our content reviewers use to make decisions about what’s allowed on Facebook.
Why it matters: Facebook is currently being grilled for a lack of transparency amid a flurry of data privacy scandals, which have made it all the way to Senate hearings. A criticism leveled at the company for years has been its lack of openness about what content is allowed on its platform. Stories of inconsistent treatment and unfair content takedowns are as old as Facebook itself. Now users have written rules they can point to when questioning a decision.
But wait. There’s more: In addition to making the guidelines public, Facebook is also opening an appeals process for users who feel their material was removed in error. To make appeals truly constructive, the company should also let users appeal when it declines to take content down; users are often frustrated when content they find offensive or harmful isn’t removed from the site.
The company’s also holding forums in May in “Germany, France, the UK, India, Singapore, the US and other countries” in order to take user feedback directly.