This article was published on October 28, 2021

The Facebook Papers: All the major revelations in a handy list

Political extremism, ethnic cleansing, and rampant misinformation



Facebook has had many bad days and months, and the company is currently facing yet another public disgrace.

Everyone’s talking about the Facebook Papers, and we’re here to summarize them for you, so you don’t have to spend days of your life reading them. Let’s dive right into them.

What are the Facebook Papers?

The Facebook Papers are a set of documents that former Facebook employee turned whistleblower Frances Haugen obtained before leaving the company. She submitted these documents to the Securities and Exchange Commission (SEC) earlier this year, and now they are available to a consortium of news outlets.

Who’s Frances Haugen and what does she think about Facebook?

Haugen, 37, was a product manager at Facebook and part of the Civic Integrity group, which worked on risks to elections, including misinformation and bot accounts. She left the company in May, but not before collecting the trove of documents now known as the Facebook Papers.

In her interview with the show 60 Minutes, she said, “Facebook over and over again, has shown it chooses profit over safety.” You can read more about it here.

The Facebook Papers have revealed a lot about how the company thinks about, and deals with, dwindling user numbers, misinformation, and its own image.

Below are some of the big problem areas for Facebook. We’ll keep updating the list as new details emerge.

Facebook, teens, and mental health

In the past decade, the number of teenage users on Facebook has steadily decreased. According to a report from Bloomberg, a recent internal study revealed that time spent by teens on the platform has declined 16% year-on-year. Facebook’s more popular among baby boomers these days.

Fewer teens are signing up for the service. That’s not surprising given the growth of other platforms such as TikTok.

A report from the Wall Street Journal published in September said that the company ignored Instagram’s harmful impact on teens, particularly in regard to self-esteem and body image.

After these reports, Nick Clegg, Facebook’s VP of global affairs, went on to defend the firm and said that it’s working on a slew of new features — including a ‘take a break’ warning — for teen safety.

Facebook has failed to moderate hateful content in different countries

While Facebook’s base lies in the US, its biggest audiences are in countries like India and Brazil. As Casey Newton of The Platformer noted, these countries are placed in ‘tier zero’ — meaning they are a high priority for its civic group, formed in 2019 to monitor for election interference.

But that doesn’t guarantee success. A New York Times article showed that the company struggles with problems like hate speech, misinformation, and the celebration of violence in India.

The report noted that an infestation of bots and fake accounts had a massive impact on the country’s 2019 national elections.

A lack of budget allocation where it’s needed

The NYT report said that, shockingly, 87% of the company’s budget to combat misinformation is allocated to the US, while the rest of the world has to make do with 13%.

This creates a massive resource crunch for monitoring countries that communicate in languages other than English.

In Myanmar, which held national elections last November, Facebook deployed tools to fight fake news, but they proved ineffective.

The situation in other countries, which are slotted in tier three according to The Platformer, is more dire. In Ethiopia, despite knowing that Facebook is being used for violence, the company did little to stop it.

Facebook’s algorithm mishap

In her revelations, Haugen said that Facebook’s 2018 algorithm change was responsible for inciting hate among its users. When it was rolled out, Zuckerberg said it was meant to increase interactions between friends and family.

However, this change backfired, as the feeds became angrier, resulting in increased toxicity and misinformation.

Make positive stories visible

As a way to combat its algorithm failures and improve its image, Facebook began to increase the visibility of positive stories about itself after a meeting in January. Plus, the social network began to distance Zuckerberg from controversial topics such as vaccine misinformation.

In 2018, the company even ran an experiment in which it turned off the News Feed algorithm for a select group of users. That led to worse experiences for many of them, with less engagement and more ads.

Employees are unhappy with the direction

Haugen is probably one of the clearest examples of how Facebook employees are unhappy with the way the social network operates. In her interview, she said the situation at the firm was substantially worse than “anything I’d seen before.”

A report from the Wall Street Journal noted that the firm ignored employees’ warnings about pages and groups run by global drug cartels and human traffickers.

“We’re FB, not some naive startup. With the unprecedented resources we have, we should do better,” said an employee after the Capitol Riots in the US in January, as per a report by Politico.

The story has numerous quotes from former and current Facebookers, who were angry about the way the firm was handling misinformation across the board.

A report from Wired echoed what some former employees have said before: the company focuses on engagement obsessively.

It also highlighted that the critical teams that deal with misinformation directly, such as the content policy team, don’t have the same power as other units, like the public policy team, to take swift action.

Zuckerberg’s response

While the company agreed earlier this year to keep King Zuck out of the spotlight on controversial issues, this one is too big to ignore.

In Facebook’s latest quarterly earnings call, the company’s chief addressed the issue and said the coverage misrepresents the firm:

Good faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company.

Want to find out more?

This is by no means an exhaustive list of all the issues uncovered by the Facebook Papers. So here’s a reading list for you to dive deeper into it:
