A leaked manual intended for Facebook employees working with its trending news module shows that while algorithms play a big part in the social network’s curation of events, it’s the human editors that have the final say.
Published today by The Guardian, the document also says that human editors can manually inject topics into the trending bar, either to replace an existing topic or to add a new one. The guidelines state that even if a topic is deemed newsworthy, editors can only manually add ones that are already appearing in Facebook's 'review' and 'demo' tools.
The surfacing of the document is sure to add fuel to accusations that Facebook deliberately suppressed conservative news in the US. It also appears to contradict comments made two days ago by Tom Stocky, Vice President of Search at Facebook, who said:
There have been other anonymous allegations — for instance that we artificially forced #BlackLivesMatter to trend. We looked into that charge and found that it is untrue. We do not insert stories artificially into trending topics, and do not instruct our reviewers to do so.
The manual also purportedly reveals that Facebook has a perhaps surprisingly traditional editorial structure, and that it checks potential stories against a relatively small number of sources to determine whether a trending story has any editorial authority. That list comprises just 10 titles: BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, and Yahoo News or Yahoo.
If it’s a leading item on at least five of those sources, it’s deemed potentially editorially important to Facebook.
For a platform with a reach of 1.6 billion monthly active users, that feels like a small scope to be looking through.
We’ve contacted Facebook for comment and will update if we hear back.
Update, May 12: Facebook’s VP of Global Operations has issued a statement:
The guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum. Facebook does not allow or advise our reviewers to systematically discriminate against sources of any political origin, period. What these guidelines show is that we’ve approached this responsibly and with the goal of creating a high-quality product — in the hopes of delivering a meaningful experience for the people who use our service.

Trending Topics uses a variety of mechanisms to help surface events and topics that are happening in the real world. In our guidelines, we rely on more than a thousand sources of news — from around the world, and of all sizes and viewpoints — to help verify and characterize world events and what people are talking about. The intent of verifying against news outlets is to surface topics that are meaningful to people and newsworthy. We have at no time sought to weight any one view point over another, and in fact our guidelines are designed with the intent to make sure we do not do so.