
This article was published on October 1, 2019

Zuck says Facebook will never curb offensive content — but vows to protect its moderators

For the past five years, we’ve heard countless reports on how hard content moderation really is, and moderators at social networking giant Facebook could have it the toughest. Earlier this year, The Verge’s Casey Newton published an article titled “The Trauma Floor,” which revealed the secret lives of Facebook moderators in America and, more specifically, the harrowing conditions these employees face on a daily basis.

In a leaked two-hour audio recording obtained by The Verge, Mark Zuckerberg reveals the company’s plans for protecting its 30,000 content moderators and for harnessing technology to curb harmful content online.

When Zuckerberg was asked about The Verge’s stories on content moderation, and how damaging the work is to employees’ mental health, the Facebook CEO dismissed the reports as “a little overdramatic.” Zuckerberg claimed that after “digging into them [reports] and understanding what’s going on, it’s not that most people are just looking at terrible things all day long.”

As you would expect of a platform with almost 2.1 billion daily active users, millions of posts are uploaded to the social network every hour, including some of a violent and disturbing nature. Ultimately, these are the kinds of posts Facebook is working to keep off your feed, but for that to happen, someone has to view them first.

Back in 2014, a report by Wired’s Adrian Chen found that contractors working on behalf of US-based companies like Facebook were instructed to look through hundreds of posts a day, and to keep an eye out for content depicting “pornography, gore, minors, sexual solicitation, sexual body parts/images, and racism,” so it could be taken down swiftly.

Because of the nature of this work, it’s no surprise that some workers develop post-traumatic stress disorder, and some start to believe the conspiracy theories they encounter, all while being paid just above minimum wage.

Zuckerberg admits “there are really bad things that people have to deal with,” and in his answer he gave some concrete examples of how Facebook protects its content moderators. The tech giant says it ensures these employees receive the right support and have easy access to mental health resources and counseling. “It’s something we’ve worked on for years and are always trying to probe and understand how we can do a better job to support that,” Zuckerberg added.

In the audio clip, we also hear Mike Schroepfer, Facebook’s CTO, describe content moderation as “a key area of focus for the product and engineering teams who are building all the tools and technology.”

That technology includes improving near-duplicate detection, which automatically flags copies of known offensive content so no one has to view them again, as well as tools that automatically blur parts of an image, convert pictures to black and white, and blur out faces.
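
To make those two ideas concrete, here is a minimal, illustrative sketch in Python. This is not Facebook’s actual system: a simple average-hash comparison stands in for near-duplicate detection, and Pillow’s grayscale and blur filters stand in for the image-softening tools described above. All file names and thresholds are hypothetical.

```python
# Illustrative sketch only, assuming Pillow is installed (pip install Pillow).
from PIL import Image, ImageFilter, ImageOps


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale image and hash each pixel against the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits in which two hashes differ."""
    return bin(a ^ b).count("1")


def is_near_duplicate(path_a: str, path_b: str, threshold: int = 5) -> bool:
    """Treat two images as near-duplicates if their hashes differ in only a few bits."""
    return hamming_distance(average_hash(path_a), average_hash(path_b)) <= threshold


def soften_for_review(path: str) -> Image.Image:
    """Return a grayscale, blurred copy so a reviewer sees a less graphic version."""
    img = ImageOps.grayscale(Image.open(path))
    return img.filter(ImageFilter.GaussianBlur(radius=8))


# Example usage with hypothetical file names:
# if is_near_duplicate("previously_removed.jpg", "new_upload.jpg"):
#     print("Likely re-upload of content that was already taken down")
# soften_for_review("new_upload.jpg").save("new_upload_softened.jpg")
```

In practice, production systems use far more robust perceptual hashing and machine-learned classifiers, but the flow is the same: match new uploads against known bad content, and transform anything a human must still review into a less graphic form.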

Although Facebook is developing tools to combat problematic content found on the site, Zuckerberg admits that the company is “not going to eliminate it completely.”

Its priority is to ensure content moderators have the support they need, especially those having the worst experiences. However, Zuckerberg explained that harmful online content is an ongoing problem, and one that technology may never eliminate completely.
