This article was published on May 27, 2020

Facebook reportedly knew its algorithms promoted extremist groups, but did nothing


Image by: Anthony Quintano

Facebook has long struggled to control extremist content on its platform, from the 2016 US elections, when Russians were able to manipulate American voters through polarizing ads, to the propaganda that spread through the social network and fueled violence in Myanmar.

A new report by Jeff Horwitz and Deepa Seetharaman in the Wall Street Journal suggests that Facebook knew that its algorithm was dividing people, but did very little to address the problem. It noted that one of the company’s internal presentations from 2018 illustrated how Facebook’s algorithm aggravated polarizing behavior in some cases.

A slide from that presentation warned that, if left unchecked, these algorithms would feed users ever more divisive content:

Our algorithms exploit the human brain’s attraction to divisiveness. If left unchecked, Facebook would feed users more and more divisive content in an effort to gain user attention & increase time on the platform.

According to the WSJ, Zuckerberg & Co. shelved the presentation and decided not to apply its findings to any of the social network’s products. What’s more, Joel Kaplan, Facebook’s chief of policy, reportedly worried that such changes would disproportionately affect conservative users and publications.

Joel Kaplan, Facebook’s chief of policy. Credit: Facebook

In a statement, Facebook said it has learned a lot since 2016 and has built a robust integrity team to tackle such issues:

We’ve learned a lot since 2016 and are not the same company today. We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve.

However, WSJ’s report noted that even before the company formed this team, a Facebook researcher named Monica Lee found in 2016 that “64% of all extremist group joins are due to our recommendation tools.”

Facebook even sought to tackle the polarization problem with proposed fixes such as tweaking its algorithm and creating temporary sub-groups to host heated discussions. However, these concepts were shot down because they were deemed “antigrowth.”

In the end, the social network did little, opting instead to uphold the principle of free speech, a value Zuckerberg has talked about a lot lately.

Earlier this month, Facebook named its Oversight Board (its Supreme Court, if you will), which can overrule the social network’s decisions on content moderation. Hopefully, the company will be forthcoming in sharing its research and learnings with the board, and not wait for someone else to report glaring problems with its products first.

You can read WSJ’s full report on Facebook’s divisive algorithms and its internal studies here.
