This article was published on May 24, 2016

Facebook says its Trending Topics picks weren’t biased, but is fixing its policies anyway

Following accusations that it deliberately manipulated its Trending Topics section to suppress content from conservative media, it came to light that Facebook had humans working on the feature, and had even issued guidelines on how they could manually add topics to the list.

On May 10, Senator John Thune published an open letter (PDF) addressed to Facebook, demanding an explanation of how it handled news articles in its ‘Trending’ list. The company has now responded after launching an investigation into the matter, saying it’s found no indication of systemic bias in its policies:

Our investigation has revealed no evidence of systematic political bias in the selection or prominence of stories included in the Trending Topics feature. Our data analysis indicated that conservative and liberal topics are approved as trending topics at virtually identical rates. We were also unable to substantiate any of the specific allegations of politically-motivated suppression of particular subjects or sources. In fact, we confirmed that most of the subjects mentioned in media reports were included as trending topics on multiple occasions.

However, the social network noted that it couldn’t entirely rule out the possibility of some bad apples applying biases of their own.

At the same time, our investigation could not fully exclude the possibility of isolated improper actions or unintentional bias in the implementation of our guidelines or policies.

In an effort to fix this, Facebook is amending its policies and introducing additional controls and oversight around the review team.

It will also “no longer rely on lists of external websites and news outlets to identify, validate or assess the importance of particular topics.”

Until now, the company used an RSS list of more than 1,000 news publishers to help it spot important stories. If a story appeared on more than three of those sites, Facebook staff could describe it without attribution to any individual source. The updated policy will see this list discarded.

In addition, it will remove "the ability to assign an 'importance level' to a topic through assessment of the topic's prominence on the top-10 list of news outlets."

It will also retrain its human reviewers to emphasize that content decisions may not be made on the basis of politics or ideology.

For his part, Senator Thune responded in kind, acknowledging Facebook's willingness to look into the matter:

While the committee remains open to new information on this matter, transparency – not regulation – remains the goal, so I thank the company for its efforts to acknowledge relevant facts and its recognition of a continuing need to transparently address relevant user questions.

Zuckerberg and Co. may have dodged a bullet there, but the social network will have to remain vigilant to avoid this sort of mishap in the future as it continues to grow its user base and its influence over the content people see.