It’s been two tumultuous years. Still, it’s highly unlikely we’ve witnessed the moment Facebook hit bottom. A new feature, spotted by some users on Monday, may offer a clue as to what that moment could look like.
In this case, it’s a tool that prompts users to comment on live video using algorithmically generated text or emoji. The suggestions, much like those in Google’s Gmail, attempt to match common replies to the context of a stream and the sentiment surrounding it.
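Facebook hasn’t said how the suggestions are generated, but features like this typically pair a coarse sentiment or topic classifier with a bank of canned replies. Here’s a minimal sketch of that idea in Python; the buckets, cue words, and replies are all hypothetical stand-ins for whatever models Facebook actually uses.

```python
# Hypothetical sketch of a suggested-comment feature. Facebook has not
# published its implementation; this only illustrates the general shape:
# classify the rough sentiment of a stream, then surface matching canned text.

# Canned replies keyed by a coarse sentiment bucket (all hypothetical).
SUGGESTIONS = {
    "somber":   ["Hope everyone is OK", "This is so sad", "Take care"],
    "positive": ["Congratulations!", "Love this", "Amazing"],
    "neutral":  ["Interesting", "Thanks for sharing"],
}

NEGATIVE_CUES = {"shooting", "assault", "victim", "tragedy", "dead"}
POSITIVE_CUES = {"wedding", "birthday", "celebration", "won"}

def classify(stream_text: str) -> str:
    """Crude keyword-based bucketing; a real system would use a trained model."""
    words = set(stream_text.lower().split())
    if words & NEGATIVE_CUES:
        return "somber"
    if words & POSITIVE_CUES:
        return "positive"
    return "neutral"

def suggest_comments(stream_text: str, limit: int = 3) -> list[str]:
    """Return up to `limit` canned replies matched to the stream's sentiment."""
    return SUGGESTIONS[classify(stream_text)][:limit]

# Example: a somber news stream surfaces the kind of replies described below.
print(suggest_comments("Live: shooting reported at Chicago hospital"))
# -> ['Hope everyone is OK', 'This is so sad', 'Take care']
```

A production system would swap the keyword matcher for a trained model and rank replies by predicted engagement, but the basic shape (classify the stream, surface matching canned text) is the same.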
And it works, for the most part. Clicking the comment field accompanying a live video allows you to select one of several heartfelt replies — so heartfelt, in fact, that Facebook has spared you the trouble of actually feeling anything at all — like “Hope everyone is OK” or “This is so sad.” Both, for those wondering, were offered as suggested replies to those watching news about yet another mass shooting, this time at a Chicago hospital. The shooting left at least four dead, including a police officer responding to the scene.
So I’m just noticing that Facebook has a thoughts and prayers autoresponder on our Chicago Hospital shooting livestream and I have thoughts pic.twitter.com/8LQULnbQty
— Steph Haberman (@StephLauren) November 19, 2018
While automated responses aren’t exactly new, the sort of disconnect required to roll them out on a platform whose stated purpose is “connecting the world” is cringeworthy. “Connecting the world through AI that requires little actual participation on your part,” granted, is a bit of a mouthful. Tone-deaf as all this may be, it’s very much in line with what we’ve seen from Facebook over the past several years. In an effort to boost engagement on its live video platform, Facebook has, once again, failed to consider the consequences of its own actions.
If you’re expecting discourse, look elsewhere. This is, after all, the same social media platform that dangles the rage carrot in front of more than two billion monthly users to keep them clicking, sharing, and moving on the proverbial hamster wheel that powers Facebook; a network that thrives on the kind of division that spreads fake news and hardens filter bubbles; a platform built from the ground up not to connect its users, but to exploit them.
The feature eschews empathy in favor of the sort of quick and thoughtless responses we’ve become accustomed to seeing pretty much everywhere we look online. (As if we need more of those.)
In today’s episode of Black Mirror, Facebook is now autosuggesting thoughts and prayers comments on live news videos https://t.co/4zeldd6csv
— Jon Passantino (@passantino) November 20, 2018
And though CEO Mark Zuckerberg will tell you that Facebook does a lot of good in the world — and arguably, it does — the harms of its existence may forever outweigh that good.
As BuzzFeed News reported, the live streams featuring the new tool are often somber, including a video from Phoenix’s Fox affiliate about a sexual assault and possible shooting in a religious supply store. Suggested responses in that case were “respect” and “take care.”
Unlike “Smart Reply,” the automated response feature Gmail recently rolled out, this isn’t about firing off notes on quarterly reports or whether you’ll attend a PTA meeting. These are newsworthy stories, in many cases of a highly sensitive nature. The idea that we could adequately communicate empathy through canned messages is reprehensible.
Overall, though, you have to admire Facebook’s consistency in bad decision-making. Whether it’s coughing up mountains of data to Cambridge Analytica, running mood-manipulation experiments in our News Feeds, or attempting to snare your children with ever-more-addictive reward mechanics, Facebook is nothing if not a model of indifference.
But maybe it’s really our fault. After all, we’re the ones still riding the train that Zuck built when all evidence suggests we should have hopped off several stops ago. And it was Zuckerberg, you might remember, who once referred to us all as “dumb fucks” for boarding in the first place.
Update: A Facebook spokesperson confirmed the feature has been disabled, for now. “We have been testing a suggested comment feature on Live videos. Clearly this wasn’t implemented properly and we have disabled the feature for now.”