Facebook is remarkably adept at showing each of us the content we care about. But is it really the content we need to see?
By now, we’re all familiar with the term echo chamber, the phenomenon of being shown only content that matches our preconceived notions, but the solution remains elusive. Netflix, as it turns out, might hold the key to the exit door of our echo chambers. But is Facebook willing to build on lessons learned by another platform? Is it even capable of a bold solution at this point?
Netflix, we have a problem
Netflix once faced a problem similar to Facebook’s so-called echo chamber.
For years, the company added titles to a growing library that, at one point, topped 7,000 pieces of unique content. You can probably already see the problem: how was anyone going to find it all?
Netflix found that the average user’s behavior was predictable: they’d watch the same few movies, television shows, and documentaries, and rarely, if ever, venture outside that subset to look for new content on the platform.
Netflix had a problem, but also an ingenious solution: find similarities between titles and suggest pieces that expand the bubble we all live in without popping it.
As different as we all are, it’s not difficult to find areas of overlap in which we display similar behavior.
To use a made-up scenario, it’s the same reason Netflix might entice Star Trek fans into watching a movie like Spaceballs. The two share the common theme of space travel but, aside from that, couldn’t be more different. Netflix was, as it turns out, slowly and imperceptibly pushing the boundaries of what we deemed acceptable and opening the door to entirely new worlds of content.
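Netflix’s real recommendation systems are far more sophisticated (and proprietary), but the core idea is simple enough to sketch. In the hypothetical Python below, every title, feature vector, and threshold is invented for illustration: the recommender favors titles in a “stretch zone,” similar enough to a viewer’s tastes to feel familiar, but not so similar that they’re a rerun of what’s already been watched.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norms if norms else 0.0

# Toy feature vectors ([space, drama, comedy]), invented for illustration.
catalog = {
    "Star Trek: TNG":   [1.0, 0.8, 0.1],  # nearly identical to the user's tastes
    "Spaceballs":       [1.0, 0.1, 0.9],  # shares the space theme, differs in tone
    "Stand-Up Special": [0.0, 0.1, 1.0],  # almost no overlap at all
}

def stretch_recommendations(profile, watched, lo=0.4, hi=0.85):
    """Suggest unwatched titles in the 'stretch zone': similar enough to
    feel familiar (>= lo) but not a rerun of past viewing (<= hi)."""
    scored = [(title, cosine(profile, feats))
              for title, feats in catalog.items() if title not in watched]
    return [title for title, sim in sorted(scored, key=lambda p: -p[1])
            if lo <= sim <= hi]

# A Star Trek fan's taste profile. TNG is filtered out as too similar,
# the stand-up special as too foreign; Spaceballs (sim ~0.72) is the
# boundary-pushing pick.
print(stretch_recommendations([1.0, 0.8, 0.2], watched=set()))  # ['Spaceballs']
```

The thresholds are the interesting knob: push hi toward 1.0 and you simply reinforce the bubble; drop lo toward zero and the recommendations stop feeling familiar at all.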
Its solution could work for Facebook too.
Facebook’s fix
Echo chambers stop us from talking to people with opposing viewpoints.
The reason is as much in our DNA as it is in modern interaction, as researchers once discovered using an MRI machine. When test subjects started arguing on social media, the brain’s logic centers switched off; instead, activity flooded the region of the brain commonly associated with the fight-or-flight response.
Put simply, subjects were ready to flee or fight over a differing opinion on social media.
Claire Woodcock, a data scientist who gave an excellent talk on the subject at SXSW today, says: “It’s kind of tribal now as if we’d picked a sports team. The truth doesn’t matter anymore, what matters is being right.”
If Facebook were to employ Netflix’s strategy, it would use its mountain of data to look for similarities among groups of people, rather than pointing us to content similar to what we read previously. In another made-up example, perhaps Facebook shows a conservative story that appeals to a liberal audience. Instead of “Mexicans are coming for our jobs,” it could surface a story with a similar meaning but a lighter angle, such as: “What makes crossing the border so appealing for Mexicans?”
Like Netflix, Facebook would identify and subtly push the line, opening our minds, over time, to opposing viewpoints. In the above example, both pieces deliver what amounts to the same message, but liberals have a far higher likelihood of engaging with the latter. The result is an opposing viewpoint delivered without directly challenging long-held beliefs or triggering the fight-or-flight response that opposing views usually provoke.
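To make that concrete, here’s a hypothetical sketch of how such “bridge content” might be chosen. The engagement numbers, group labels, story names, and thresholds are all made up, and Facebook’s actual data and models are obviously far richer. The idea: pick stories that clearly resonate with the opposing audience, yet still clear a minimum engagement bar with the reader’s own group.

```python
# Fraction of each audience that engages positively with each story
# (hypothetical numbers for illustration only).
engagement = {
    "Hardline take on immigration": {"conservative": 0.60, "liberal": 0.05},
    "Human-angle border story":     {"conservative": 0.45, "liberal": 0.35},
    "Liberal op-ed":                {"conservative": 0.04, "liberal": 0.55},
}

def bridge_stories(reader_group, other_group, min_other=0.4, min_reader=0.25):
    """Stories that carry the other group's viewpoint (engagement >= min_other
    there) yet still land with the reader's group (>= min_reader),
    ranked by reader-side appeal."""
    hits = [(story, scores[reader_group])
            for story, scores in engagement.items()
            if scores[other_group] >= min_other
            and scores[reader_group] >= min_reader]
    return [story for story, _ in sorted(hits, key=lambda p: -p[1])]

# For a liberal reader: the hardline story fails the min_reader bar, but the
# human-angle framing of the same topic passes -- the lighter-angle pick.
print(bridge_stories("liberal", "conservative"))  # ['Human-angle border story']
```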
It’s a peek into the conservative psyche without threatening liberal ideals. Over time, this peek could turn into longer looks. Eventually, our bubbles could grow to include opposing viewpoints, or the whole experiment could piss everyone off and force them to leave Facebook.
But it worked out okay for Netflix.
h/t Claire Woodcock