
This article was published on December 14, 2018

‘They clearly don’t care’: Fact-checkers on the state of misinformation at Facebook


After a deluge of fake news began to overwhelm Facebook, it got proactive. The social media platform tapped nonpartisan organizations such as Snopes to fact-check content and use their expertise to combat the spread of misinformation.

Or, that’s what we were led to believe, anyway.

“They’ve essentially used us for crisis PR,” said Brooke Binkowski, former managing editor at Snopes, a fact-checking site that partnered with Facebook for two years. “They’re not taking anything seriously. They are more interested in making themselves look good and passing the buck … They clearly don’t care,” she told The Guardian.

Binkowski said that on at least one occasion, Facebook encouraged fact-checkers to prioritize the debunking of information that hurt its advertisers.

Facebook denied the allegations today in a blog post. The company also touted the effectiveness of its misinformation initiatives, stating:

Fact-checking is highly effective in fighting misinformation: when something is rated “false” by a fact-checker, we’re able to reduce future impressions of that content by an average of 80%. We also leverage these ratings to take action on Pages and websites that repeatedly share misinformation. We de-prioritize all content from actors who repeatedly get “false” ratings on content they share, and we remove their advertising and monetization rights.

Facebook first started partnering with news outlets after the 2016 presidential election, an election that made “fake news” part of the worldwide lexicon. That year, in fact, the 20 top-performing fake election stories out-performed top stories from legitimate media outlets by more than a million engagements: 8.7 million shares, reactions, and comments, compared with 7.3 million.

To combat this, Facebook began partnerships with more than 40 news organizations, trusting the likes of the Associated Press, PolitiFact, and Snopes with ridding the site of misinformation.

At the time, most considered it a positive move, a step in the right direction for a social platform infiltrated by Russian operatives during the 2016 election.

But now, these same publications are looking for a way out.

“Why should we trust Facebook when it’s pushing the same rumors that its own fact-checkers are calling fake news?” asked one current Facebook fact-checker, who asked The Guardian not to be named. “It’s worth asking how do they treat stories about George Soros on the platform knowing they specifically pay people to try to link political enemies to him?”

It’s a fair question following recent allegations in the New York Times that COO Sheryl Sandberg once hired a consulting company to dig up dirt on Facebook’s most vocal critics. Billionaire philanthropist George Soros was one of the targets of Facebook’s opposition research.

“Working with Facebook makes us look bad,” added The Guardian’s anonymous source. The fact-checker has advocated for an end to the partnership.

The sentiment is shared among many of the fact-checkers Facebook currently employs, according to the report. Many cite the platform’s slow, if not non-existent, reaction to legitimate threats. Others believe the company is ignoring the problem, seeking to manage public perception rather than to combat misinformation.

Binkowski believes all of these things, adding: “I strongly believe that they are spreading fake news on behalf of hostile foreign powers and authoritarian governments as part of their business model.”
