Facebook is testing a new feature that presents you with alternate perspectives before you read an article – and it might be its best attempt at combating fake news yet.
It’s doing so with a small addition to its ‘Related Articles’ feature. First introduced in 2013, the feature would pull up relevant content after you’ve read an article, in a bid to simply get you to read and share more news on Facebook. Today’s test instead shows links to other articles before you even click on anything.
Facebook doesn’t specifically say it’s looking to combat fake news this way, but it’s clear that’s on the company’s mind:
> These additional articles, which appear for topics many people are talking about on Facebook, will appear in a unit below the link. That should provide people easier access to additional perspectives and information, including articles by third-party fact-checkers.
It’s a subtle change, but one that could have a real impact. Previous efforts have focused on combating outright fake news by flagging false stories (which often takes too long) and teaching people to spot obviously sketchy pieces (which is putting a lot of faith in users). But the real danger is in pieces that aren’t necessarily completely fake; sometimes articles simply exaggerate the truth or are dripping with so much bias that they sway opinions.
By showing the links before you click through to an article, Facebook is able to ensure a couple of things:
- That you are actually aware of different perspectives before you’re influenced by whatever *insert political leaning here* publications you typically peruse. Heck, you might even end up reading an opposing perspective before you read the piece you meant to; it could add a dose of conservative viewpoints for a liberal user, and vice versa.
- That you also see articles from reputable sources, instead of just sensationalist ones.
- That Facebook doesn’t have to wait for a third-party fact-checker (which can sometimes take days) before providing counterpoints to a particular article, helping to nip fake news in the bud.
Basically, Facebook is trying to pop your filter bubble. If all you read are sensationalist alt-center posts by Illuminati worshippers, Facebook wants to make sure you also see news that is a bit more… reasonable. It’s a more sensible approach to tackling fake news and balancing bias than simply adding a big red ‘fake news’ label. This way, users can determine what’s fake on their own.
Of course, it’s still up to users to put in the effort to read opposing viewpoints – some simply might not want to – but at least Facebook can absolve itself of some of the blame. It still has a long way to go – and it’s worth reiterating that this is just a test for now – but it’s a step in the right direction.