It’s almost as if Mark Zuckerberg could hear the sound of furious journalists pounding away on keys. Or, more likely, he read one of the countless pieces that sought to expose Facebook’s role in last week’s election. Either way, he’s responded to the media outcry that Facebook was directly responsible for Trump’s win.
I ranted about this very thing after the election last week, but it bears repeating: Facebook isn’t responsible for Trump’s ascension to power, at least not directly.
That’s not a fair burden to place on the shoulders of a platform meant to facilitate information sharing. If you’d like an easier argument to make, blame the algorithmic timeline that forces each of us into an echo chamber of similar world views. Blame the decision to cut human editors. Blame Facebook’s decision to allow pages that deliberately misrepresent a news story (or make one up entirely) to operate with impunity.
For every argument Zuckerberg makes about being a platform to promote free speech and open views, I can make another about conscious misrepresentation and the promotion of libelous material. And his argument that “truth” is complicated is falling on deaf ears.
This is an area where I believe we must proceed very carefully though. Identifying the “truth” is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted.
He’s not wrong. Truth is a complicated matter, and even major publications get it wrong from time to time. But we’re not seeking perfection. Something, anything, would be a step in the right direction. A simple threshold-based approach would work wonders: once a certain percentage of a Page’s content is deemed false, the Page is suspended. Keep it up and the Page is deleted from the platform entirely.
If Zuckerberg can make the claim that “more than 99 percent of what people see [on Facebook] is authentic,” then it’s certainly not a stretch to assume he can apply the tools used to come up with that stat on a per-Page basis.
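The kind of per-Page policy described above could be sketched in a few lines. To be clear, the function, data, and thresholds below are purely hypothetical illustrations of the idea, not anything Facebook actually uses:

```python
# Hypothetical sketch of a per-Page "law of averages" moderation policy.
# The thresholds and function are illustrative assumptions only.

SUSPEND_THRESHOLD = 0.10   # suspend once 10% of a Page's posts are flagged false
DELETE_THRESHOLD = 0.25    # delete Pages that keep it up past 25%

def evaluate_page(total_posts: int, flagged_false: int) -> str:
    """Return a moderation action based on the share of a Page's
    posts that fact-checking has deemed false."""
    if total_posts == 0:
        return "ok"
    false_rate = flagged_false / total_posts
    if false_rate >= DELETE_THRESHOLD:
        return "delete"
    if false_rate >= SUSPEND_THRESHOLD:
        return "suspend"
    return "ok"

print(evaluate_page(200, 1))    # 0.5% false -> "ok"
print(evaluate_page(200, 30))   # 15% false  -> "suspend"
print(evaluate_page(200, 60))   # 30% false  -> "delete"
```

The point isn’t the exact numbers; it’s that if Facebook can already compute a platform-wide authenticity figure, the same measurement applied per Page yields an enforceable rule.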
Arguing that Facebook isn’t at fault for giving a platform to the masses is fair. Arguing that it shoulders no responsibility in how they use it, well, isn’t.
It’s Facebook, after all, that turned a blind eye as alt-news sources built massive followings through advertising dollars and scandalous headlines. It again turned the other way when these Pages used their newfound soapbox to spread misinformation in the form of half-truths, or outright lies. The former is inevitable; it’s the latter Zuck and Co. should take issue with — yet until today, nothing.
Let’s be clear: people who share fake news are at fault for the majority of these issues.
Pages tailor their content strategy around what’s most likely to be shared, so the onus for misinformation falls directly in our lap. If we didn’t share the stories, Pages would stop creating them.
Where Facebook deserves criticism is for failing to do the one thing that would place the blame entirely on us: delete the Pages responsible for bogus information. This certainly wouldn’t end the problem — people still have the power to post fake news from outside Facebook onto their personal profiles — but it would quiet it considerably.
It’s also not without precedent.
Facebook has already taken a hard stance against affiliate marketers that use pages to attract an audience and profit from it via product sales or clickthroughs to the home page. For some reason, it doesn’t seem to mind when Pages are dressed up as news sources, even though it’s clear the latter could be far more damaging.
Sorry, Zuck, you’re wrong on this one. Facebook might have been created for a different purpose, but it’s become the world’s largest platform for sharing news. Like it or not, that carries with it a certain amount of responsibility to your users.