This article was published on June 19, 2012

Facebook and safety: A behind-the-scenes look at what happens when you click “Report”

With so much information passing before our eyes on Facebook, we’re bound to come across content that’s inappropriate. And with so many people using a site like Facebook, there’s always the possibility that bad actors will abuse it and that we’ll find ourselves being harassed.

Facebook is well-equipped to deal with issues like this, but the company hasn’t been completely clear on what happens to the things that you report to them. To clear up some of that confusion, the company has posted some detailed information on what happens when you click that “Report” button.

Here’s what the security team had to share today:

At Facebook, there are dedicated teams throughout the company working 24 hours a day, seven days a week to handle the reports made to Facebook. Hundreds of Facebook employees are in offices throughout the world to ensure that a team of Facebookers are handling reports at all times. For instance, when the User Operations team in Menlo Park is finishing up for the day, their counterparts in Hyderabad are just beginning their work keeping our site and users safe.

The post goes on to describe how Facebook splits its teams up by report type: Safety, Hate and Harassment, Access, and Abusive Content. Depending on what type of issue you’re reporting, the report flows through a corresponding series of systems and processes.

If you report content that is against Facebook’s terms of use, the content can be deleted and a warning could be issued to the person who posted it. The system works and Facebook is dedicating staff from all over the world to keep the network safe.

Here’s an interesting infographic that shows Facebook’s issue reporting flow in great detail:

It’s a pretty fascinating look at the lengths a company as large as Facebook must go to in order to ensure that its users are safe. After all, if we’re interacting with friends, family members and loved ones on the service, we don’t want to be bothered with hacked accounts, spam, and porn.

This type of approach to community management, support and safety is key to keeping users on the site and happy. For a social service like Facebook, people are its most important asset.

The security team goes on to say:

The safety and security of the people who use our site is of paramount importance to everyone here at Facebook. We are all working tirelessly at iterating our reporting system to provide the best possible support for the people who use our site. While the complexity of our system may be bewildering we hope that this note and infographic has increased your understanding of our processes. And, even though we hope you don’t ever need to report content on Facebook, you will now know exactly what happened to that report and how it was routed.

If you think about it, this team is basically the online equivalent of your local police force. It’s a thankless job, but one that’s not only required, but appreciated once something bad happens.
