
Facebook won the war for your mind. Now it wants your children.

Fashioning a noose from a scarf, 14-year-old Naika Venant ignored the pleas of Facebook Live viewers as she secured the makeshift device to a shower rod and ended her own life. When the stream ended just over an hour later, all that remained was the lifeless body police found in the bathroom, just feet from her sleeping foster parents.

Venant’s story is tragic, as are those of Ayhan Uzun, Katelyn Nicole Davis, Frederick Jay Bowdy, and countless others who have attempted or completed suicide in front of a live audience.

Facebook denies any culpability. The social network promises it’s taking the situation seriously and pledges to hire more moderators. It’s also deploying AI designed to seek out signs of dangerous behavior. Through technology, or perhaps sheer force of will, Facebook plans to scour the platform’s dark corners — places known to house communities of pedophiles, rapists, and terrorists — in an attempt to prevent future loss of life.

In the meantime, families of the deceased would undoubtedly settle for the removal of these macabre videos, which still resurface from time to time, months later.


As with most suicides, extenuating circumstances are at play. Mental illness, significant loss, or abuse are common forces that drive otherwise happy people to do the unthinkable. Facebook, to be clear, isn’t responsible when an individual in need of help decides to take her own life.

But it’s certainly not helping.

Facebook and the fight for kids’ attention

Minutes after Facebook announced a new app for children, you could feel the collective ire of the internet. Messenger Kids, set to debut December 11, is a stripped-down version of Facebook’s primary chat app. Unlike the version adults use, this one is built for kids under 13, who can’t, legally speaking, create their own accounts.

On the surface, the app is innocent enough. Kids can send photos, videos, and instant messages (presumably) to other kids on the network. They can draw on their creations, add stickers, and enjoy most of what they’re already doing (illegally) on apps like Snapchat, Facebook, and Instagram — all of which have explicit language prohibiting children under 13 from using them.

Language, though, offers deniability.

Banning children under 13 from Facebook isn’t an attempt at taking the moral high ground. Under the Children’s Online Privacy Protection Act (COPPA), any website that collects information about its users is prohibited from doing so with kids under 13.

CEO Mark Zuckerberg, for what it’s worth, has continually spoken out against this sort of regulation. He once told CNN that “[dropping the minimum age] is a fight we’ll take on at some point.”

He later added:

My philosophy is that for education you need to start at a really, really young age. Because of the restrictions we haven’t even begun this learning process. If they’re lifted then we’d start to learn what works. We’d take a lot of precautions to make sure that [younger children] are safe.

You don’t have to be a cynic to realize Zuckerberg isn’t a great choice to babysit these kids.

Embrace the future

While Facebook has cemented itself as a digital titan in our current ecosystem, the future looks to be a very different place. For all the data Facebook collects on its users, there’s only one metric that matters to the continued viability of the world’s largest social network: whether it can attract teens. Right now, it can’t.

While Blockbuster, Borders, and other megacorporations once scoffed at those who foretold their demise, Facebook is facing its uncertain future head-on. It knows that without an influx of young users, its prospects are rather bleak.

By indoctrinating those under 13 into the Facebook ecosystem through its new app, the social network hopes to groom the next generation of users.

We’ve seen this before, but this time it’s worse.

Taking a page from the playbook of beer and cigarette companies in the ’80s and ’90s, Facebook hopes that attracting kids will lead to brand loyalty as they mature. Or perhaps it doesn’t care: advertising to children is big business on its own, but the move pays double if it can create the sort of loyalty that keeps them around into adulthood.

Joe Camel, the illustrated spokesperson for Camel cigarettes, reigned supreme for nearly a decade before regulators ultimately decided that these cartoon ads, and others like them, constituted marketing a dangerous product to children.

The Camel wasn’t alone. Spuds MacKenzie sparked the debate over advertising to children almost a decade prior. Spuds, Budweiser’s bull terrier mascot, was retired in 1989 due to pressure from the Center for Science in the Public Interest, Mothers Against Drunk Driving, and Sen. Strom Thurmond. The dog was later replaced with a trio of talking frogs, which met the same fate. They, interestingly enough, were replaced with three lizards: because frogs are for children and lizards are for adults, obviously.

But while its motivations are obvious, Facebook’s newfound obsession with attracting an ever-younger audience is disturbing in ways the cigarette and beer companies couldn’t have begun to understand. Those companies vied for transactional value: a Budweiser purchased at the local grocery store, or cigarettes at the bodega. Facebook isn’t attempting to sell your kids a dangerous thing; it’s attempting to sell them all the dangerous things, including the platform itself.

You’re the product

To detail every criticism of Facebook, one would need a book deal and a few hundred blank pages — although this is a good start. For the sake of brevity, we’re just going to deal with what’s relevant to understanding why your children (and their data) will never be safe in the hands of Uncle Zuck.

After all, you can’t talk about Facebook without first mentioning its blatant disregard for its users. As the saying goes, “If you’re not paying for it, you’re not the customer; you’re the product.”

It may seem like a worthy trade: we suffer through a few ads to stay connected with family, watch viral videos, scope the latest memes, and rant about the customer service at that burrito joint around the corner. Advertisers get value from the network’s two-plus billion users while Facebook gets to keep the lights on and outfit Zuck in a never-ending supply of gray t-shirts. But Facebook’s ad business isn’t appealing because of its colossal userbase; it’s appealing because of what the network knows about each of those users — information it shares with those who purchase ads.

And, to put it mildly, Facebook knows a lot.

Even this, in and of itself, isn’t that bad. We willfully hand over this information; Zuckerberg just had the foresight to collect it and bundle it in a package advertisers were willing to spend money on.

Where the company runs afoul is in how it uses this data.

There’s fake news. That’s an obvious place to start. While Facebook isn’t creating the news, it’s reaching those most vulnerable to it by handing its targeting insights to those peddling this kind of garbage. It’s creating a new generation of non-thinkers, people significantly more likely to share a status update they agree with than one rooted in truth.

Or there’s the ProPublica report accusing Facebook of allowing discriminatory housing ads. After buying dozens of rental housing ads on the network, ProPublica asked that they not be shown to “African Americans, mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Argentina, and Spanish speakers.”

All but one of the ads were approved within minutes. The lone holdout sought to hide the ad from those “interested in Islam, Sunni Islam, and Shia Islam.” That one took 22 minutes to approve.

We’ve raised similar concerns about this practice ourselves, questioning the morality of displaying different versions of the “Straight Outta Compton” trailer to audiences of different races and ethnicities.

Worse, it’s still going on.

And then there’s arguably the most egregious thing Facebook has ever done with our data.

In the now-infamous 2014 emotional manipulation study, the Zuckerberg brain trust set out to see if it could manipulate users by carefully determining what to place in their News Feeds. Titled “Experimental evidence of massive-scale emotional contagion through social networks,” the study never asked for the consent of the more than 600,000 users whose moods it attempted to alter (both positively and negatively) by controlling what they were allowed to see while logged in.

Presumably, this isn’t the sort of information you’d willingly hand over about your kids. Yet we’re doing just that. Worse, we’re allowing Facebook to parent our children and influence the way they see the world — for better or worse.

Its best feature may be the ‘log out’ link

Humans are social beings; we thrive when we have strong, positive relationships with others. One way of looking at Facebook is that by connecting us with more people, it increases our feelings of interconnectivity and thus produces happier humans. On paper, it checks all the boxes.

In practice, nothing could be further from the truth.

Research shows Facebook may be detrimental to human wellbeing. It detracts from face-to-face relationships, increases sedentary behavior, erodes self-esteem, and could compound the effects of addiction. It creates feelings of envy; increases instances of stalking, divorce, and depression; and has a negative impact on both sleep and academic performance in students.

According to the authors of one Harvard study, writing in Harvard Business Review:

Overall, our results showed that, while real-world social networks were positively associated with overall well-being, the use of Facebook was negatively associated with overall well-being. These results were particularly strong for mental health; most measures of Facebook use in one year predicted a decrease in mental health in a later year. We found consistently that both liking others’ content and clicking links significantly predicted a subsequent reduction in self-reported physical health, mental health, and life satisfaction.

Facebook isn’t an inherently evil company — at least we hope it isn’t. But playing fast and loose with its users’ emotions has severe consequences, consequences Facebook doesn’t seem to consider when finding newer, more pervasive ways to keep us engaged, mental health be damned.

As with all social networks, the goal is to keep users on the site and digging through content. The longer they keep us on the site (or app), the more ads they can force-feed us, using intelligence we’re willfully giving away while consuming said content. In this never-ending feedback loop, social networks use gathered intelligence and industry insight to design platforms in ways that exploit the mind’s vulnerabilities.

Social networks are actively exploiting our weaknesses as a profit model, one meant to draw users ever deeper into an ecosystem that’s slowly killing them — or at least killing their chance at happiness.

Tristan Harris, a design ethicist at Google, explains that these are exactly the kinds of exploits magicians use to delight an audience. They look for “blind spots, edges, vulnerabilities and limits of people’s perception so they can influence what people do without them even realizing it.”

Harris says:

Once you know how to push people’s buttons, you can play them like a piano … This is exactly what magicians do. They give people the illusion of free choice while architecting the menu so that they win, no matter what you choose.

I can’t emphasize enough how deep this insight is.

Put simply, nothing on Facebook happens by accident. From the way menu systems are laid out to the illusion of choice when tagging friends in photos, everything is deliberate.

You may believe these are conscious decisions. You control how often you pick up your phone (about 150 times a day, according to this study), and you may convince yourself it’s a habit bred of boredom or curiosity. The truth is much darker than that. You’re an unwitting pawn in a game played between Facebook and its advertisers. As Harris says, you have a slot machine in your pocket waiting to reward each “spin” with subtle payoffs. Only these rewards aren’t money, physical pleasure, or chemical substances — although they stimulate your brain’s pleasure centers in much the same way.

Facebook is the new cigarette, but worse

As online babysitters go, Facebook is a pretty terrible one, all things considered. Yet thousands, perhaps millions, of parents willfully allow their children access to the social network even though its terms of service expressly prohibit lying about your age to create an account.

A 2011 study found that 76 percent of the parents surveyed (across 1,007 households) allowed their children to create an account before turning 13. Of those, 53 percent said they were aware Facebook had a minimum sign-up age, and 35 percent believed the minimum age to be merely a suggestion. This raises real questions about the shortcomings of federal law, but also about whether social networks themselves should shoulder more of the blame for not creating this sort of awareness — or for not doing more to actively enforce their own rules.

Maybe it doesn’t matter, though. If Messenger Kids is any indication, the social network has plans for your children that disregard parental consent entirely.

And for a social network that’s proven to be a destructive force for adults, it’s hard to understand why we aren’t scrutinizing the decision to allow children to join before they fully understand what’s at stake. If Facebook and social media are responsible for our growing discontent, imagine the consequences for a generation that never knew a world without them.
