Buzzfeed revealed earlier this week that Facebook’s latest solution to the problem of trustworthy news was a user survey consisting of two questions: “Do you recognize the following websites” and “How much do you trust each of these domains?”
At first, this sounds too simplistic. But think about it for longer than it takes to complete the survey, and it looks more useful than it seems.
Mark Zuckerberg announced last week that Facebook would prioritize news sources by trustworthiness, and that the question of whom to trust would be in the hands of the users:
We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking. We decided that having the community determine which sources are broadly trusted would be most objective.
That, in and of itself, drew some derision. Facebook’s users play a large part in the dissemination of unreliable news, and now supposedly they’d be the ones deciding which sources were trustworthy? Reactions grew even more concerned when the number of questions was revealed. Even Buzzfeed made a point of saying it didn’t find the survey “nuanced.”
Still, it might be more insightful than it appears at first glance. According to Adam Mosseri, Facebook’s head of News Feed, the survey is not going to be sent to all Facebook users, but rather to a sample of them.
It’s worth noting this isn’t a rating system, nobody can opt into rating a publisher as trustworthy. We randomly sample new people each day, and only their responses are used. I’m sure some bad actors will try and game the system, but it’s not as easy as you suggest.
— Adam Mosseri (@mosseri) January 21, 2018
The surveys will be re-run every day with a different set of people. It’s not something you can just opt into, meaning it’ll be harder for people to game the system without some real effort. Mosseri also clarified that the sample would include people with varied reading habits.
The headline is misleading. We ask people what they trust, but don’t simply value more the publications that get positive replies. We specifically look for publications that are trusted by people with a wide range of reading habits, so trusted by many different types of people.
— Adam Mosseri (@mosseri) January 20, 2018
The survey will also be just one element among many that determine trustworthiness. Trust scores will play a part, but won’t be the be-all and end-all for publishers.
If you think about it, what further questions need to be asked? If you ask a certain number of people, all of whom you know read different things, whether they trust TNW, and they all say yes, that’s a good indicator the site engenders trust across lines.
On the surface, a short survey of the crowd doesn’t appear to be the smartest way to determine something as nuanced as editorial integrity. Yet the survey engages Facebook users enough to ensure communal opinion plays a part, but not so much that Facebook seems to be leaving the decision to the crowd entirely.
Is this likely to eliminate Facebook’s ongoing migraine over news? Nope, but it’s a step in the right direction.