Facebook revealed this week that it’s trying to stem the flow of fake news by assigning trust values to users. It insists on keeping its criteria for trustworthiness secret, though, in case untrustworthy people try to game the system — and they almost certainly will.
Tessa Lyons, Facebook’s product manager, told The Washington Post a bit more about the system, in which the company uses several flags to identify which people on the site are more trustworthy than others. It rates users on a scale of zero to one. The only judging metric she would admit to is a person’s history of reports.
So if a person consistently flags a news source as fake when Facebook itself doesn’t consider that source untrustworthy, then Facebook judges the person to be untrustworthy. Lyons implied the company takes this to mean the person reported the site out of an ideological disagreement: “I like to make the joke that, if people only reported things that were [actually] false, this job would be so easy! People often report things that they just disagree with.”
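Facebook hasn’t disclosed its formula, but based on Lyons’ description you can imagine a toy version of such a score: the fraction of a user’s past reports that agreed with the platform’s own fact-checks. Everything below is a hypothetical sketch, not Facebook’s actual method:

```python
# Hypothetical sketch of a zero-to-one trust score based on report history.
# Facebook's real criteria are secret; this only illustrates the idea Lyons
# described: users whose reports match the platform's own verdicts score
# higher than users who flag stories they merely disagree with.

def trust_score(reports):
    """reports: list of (user_flagged_fake, platform_judged_fake) pairs."""
    if not reports:
        return 0.5  # assumption: no report history means a neutral score
    agreed = sum(1 for flagged, judged in reports if flagged == judged)
    return agreed / len(reports)

# A user whose flags usually match the fact-checkers' verdicts:
careful = [(True, True), (True, True), (False, False), (True, False)]
# A user who mostly flags stories the platform judged legitimate:
partisan = [(True, False), (True, False), (True, False), (True, True)]

print(trust_score(careful))   # 0.75
print(trust_score(partisan))  # 0.25
```

Note that in a scheme like this, publishing the formula would immediately tell bad actors how to farm a high score — which is exactly the reason Facebook gives for keeping its version secret.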
I don’t think for a second this is something Facebook just came up with — I would assume the primary reason it’s revealing this right now is to show it can “protect” us in the run-up to the 2018 midterm elections. Unfortunately, in order for the system to work, Facebook can’t really reveal more than it already has — at least, according to Lyons. If Facebook told you exactly how it determines you can’t be trusted, you’d be able to figure out how to game the system into thinking you can be.
Untrustworthy users are just as much of a problem as untrustworthy news organizations. Remember that event page for the incendiary counter-protest to the Unite the Right rally — the one later revealed to have been tainted by the stink of Russian interference? All of the legitimate protesters in that case were drawn into the event by ostensibly real people who assured them of good intentions. Even one of the event’s major organizers confessed he thought there was something funny about it — he told WaPo he was suspicious that the person he spoke to didn’t want to meet or talk over the phone — but he went along with it because the other person asked him to.
But so much for judgment. What about solutions? Facebook began using a similar system to rank the trustworthiness of news organizations earlier this year. CEO Mark Zuckerberg told a group of reporters at the F8 conference he was working on a ranking system, and news ranked as untrustworthy would be less visible in news feeds.
Trouble is, he might not be able to apply the same solution to untrustworthy users without running the risk of impeding their ability to use Facebook. The thought that Facebook might actually quash your speech because of an internal metric you know nothing about is more alarming than the idea that it’s judging you in the first place.