During Facebook’s quarterly earnings call today, Mark Zuckerberg highlighted some major changes coming to Facebook. After years of being the primary news source for many people, Facebook has decided it wants to show less political content.
According to Zuckerberg, “there has been a trend across society that a lot of things have become politicized and politics have had a way of creeping into everything.” He noted “one of the top pieces of feedback” that Facebook is hearing is that “people don’t want politics and fighting to take over their experience.”
To mitigate this, the company is looking to reduce the amount of political content that shows up in people’s feeds. The CEO didn’t go into detail about how exactly Facebook would do so but said the change is a continuation of work Facebook has been “doing for a while to turn down the temperature and discourage divisive conversations.”
To that end, the company has also decided to permanently stop recommending political groups for users to join:
“We have to balance this carefully because we do have deep commitment to free expression. If people want to discuss [politics] or join those groups, they should be able to do that. But we are not serving community well to be recommending that content right now.”
The move follows a temporary pause in such recommendations ahead of the US elections.
My take: One could argue that this is a way for Facebook to shed responsibility, but at the same time, I could certainly do with fewer political arguments on social media. I prefer getting my news from authoritative sources, and when all people see is a headline and a blurb on social media — we’ve seen that plenty of people don’t actually read articles they share — it’s easy to get lost in the noise.
Presumably, you’ll still be able to see news from pages you’re subscribed to, but the company could cut down on how often this content shows up on your feed. It’s also possible Facebook will reduce the visibility of political content that people share themselves, which could help prevent the spread of misinformation.
That said, I suspect it won’t be long before right-wing groups attack such a change as a form of conservative censorship, even though at least some evidence suggests Facebook’s algorithms have so far favored right-wing sources.