Deliberately creating friction for users is high on the list of things a developer should never do. According to a new report, Facebook did just that to test user loyalty.
From The Information:
Facebook has tested the loyalty and patience of Android users by secretly introducing artificial errors that would automatically crash the app for hours at a time, says one person familiar with the one-time experiment. The purpose of the test, which happened several years ago, was to see at what threshold a person would ditch the Facebook app altogether.
To be fair, Facebook’s efforts here were reportedly in anticipation of a war with Google. The Information says the relationship between the two companies has been strained for years, and that Facebook is entertaining the idea of sidestepping Google’s Play Store (and other services, like Maps) to distribute its apps directly.
My real hangup here is that Facebook didn’t let users in that “small country” opt in or out of its testing. A company the size of Facebook could also eat its own dog food by running the artificial-error experiment on its own employees; there’s no reason to blindly subject loyal users to such tests.
Sadly, this isn’t new for Facebook. In 2014, the company artificially manipulated the News Feed for some users. The alleged aim at that time was to see if a more pleasant News Feed caused users to create more pleasant posts.
It’s hard to know what to make of Facebook in these instances. A simple opt-in — even a blind one where the company didn’t tell users what it was testing — would have been much better form.
There’s also enough evidence to wonder if Facebook’s games are ongoing. We know it tests features regularly without notifying users, but is it still forcing users to jump through unnecessary hoops because it and Google can’t play nice?
Facebook is right to imagine a time when it doesn’t need to do business through Google’s Play Store (if the two companies are indeed having issues with one another), but forcing users into compromised positions is disgusting. We’re encouraged to wrap our lives up in Facebook, and subjecting users to these experiments without consent dissolves trust.
If you ever trusted Facebook at all, that is.