This article was published on December 23, 2018

The internet is toxic because humans are toxic

We can’t keep blaming human behavior on the robots


When in doubt, blame the robots. As Facebook has fallen from grace and struggled to reconcile its role in spreading propaganda and stoking political anger, the company has proposed a familiar solution:

If the algorithm has failed, let’s just build a better algorithm.

It’s a noble goal for the next hackathon. As a mechanism for real change, however, the focus on software misses the point.

Facebook’s problems can’t be solved with more data or better code. They’re simply the most potent and alarming example of the fact that the internet has failed as a public forum.

Not long ago, the scientists and software developers who pioneered the World Wide Web thought it would democratize publishing and usher in a more open, educated, and thoughtful chapter of history.

But while the internet and its offshoot technologies have improved society and daily life in many ways, they have been an unmitigated disaster for the way people communicate and learn.

It feels good to blame Facebook, but the crisis is evident in every nook and cranny of the web. The internet is crawling with normal, everyday humans who transform into vicious, nihilistic psychopaths the moment they’re granted even a thin veil of anonymity in a comment thread. 

This was the nature of online communication in 1995, when astronomer and early adopter Clifford Stoll lamented in Newsweek:

The cacophony more closely resembles citizens band radio, complete with handles, harassment, and anonymous threats. When most everyone shouts, few listen.

His words still ring true 23 years later.

Despite Facebook’s efforts to make its platform less anonymous than its predecessors, it is painfully clear that people have no problem piling on the hate when tapping away on a keyboard or phone, comfortably distant from the real person reading their words many fiber-optic cables away.

Facebook, along with most of its social media counterparts, operates in part based on the Silicon Valley hypothesis that if all ideas are distributed freely, the most valuable ones will rise to the top.

Instead of validating that seemingly uncontroversial point, however, our collective experiences online have proven it wrong. The marketplace of ideas has become an environment with no barriers to transmitting vitriol to millions, largely negating the internet’s egalitarian, utopian goals.

“I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place,” Evan Williams, a founder of Twitter, Blogger, and Medium, said in a 2017 interview with the New York Times. “I was wrong about that.”

Perhaps we wouldn’t still be defending this failed experiment if it weren’t so stunningly profitable. Facebook didn’t invent advertising, but it has scaled the many flaws of that business model to an unprecedented magnitude.

The runaway financial success of highly targeted pay-per-view ads has warped software design into a competition to build the most addictive digital slot machine, masquerading as social engagement.

The longer we stay hooked, the more advertising we see, and the more eyeballs can be sold for a fraction of a penny apiece. This is why you get an email if you don’t sign in for a few days.

This is why your apps throttle your notifications, slowly distributing the likes on your vacation photos over the course of minutes rather than seconds. It’s an elaborate manipulation to keep you coming back for more.

It’s also why Jeffrey Hammerbacher, a former Facebook engineer, pined for something more in a 2011 interview with Businessweek:

“The best minds of my generation are thinking about how to make people click ads… That sucks.”

Like Facebook’s other innovations, the advertising platform is valuable for individuals but disastrous at scale. The company’s wealth of demographic and behavioral data helps small business owners find their niche audience, but it is also a gold mine for amateur propagandists.

The platform earns money from engagement—and guess which ads and articles are the most engaging? It’s not the calm, thoughtful, balanced ones.

And that brings us to the news. Veteran journalists spent the web’s first decade tentatively dipping their toes in the new technology, rightfully skeptical of its supposed virtues.

They felt the sting of getting scooped by fly-by-night blogs with low budgets and lower standards. Then, slowly but surely, they hopped on board. By the time Facebook became a significant source for news, the major publications knew better than to risk being left behind.

The rush of traffic from Facebook has been beneficial for the publications’ bottom lines. However, that has been accompanied by an avalanche of meaningless and disorienting social feedback—a thoughtless “like,” a forgotten “share”—that has pushed even the best publications toward clickbait and sensationalism.

According to the analytics, that’s what people like. The result is that writers and editors have far less leeway to focus on what they believe is valuable, because the ultimate arbiter of success and prestige is the fleeting gratification of a page view. More news, faster news, and trendier news paved the road to victory.

The source of this problem can’t be found in a data center. The algorithm has failed because people are collectively seeking knowledge and human connection via the impersonal interface of the internet and then feeling angry and confused when they come up empty-handed.

There is no software that can force commenters to engage in respectful debate. There is no app to eliminate the immense conflicts of interest and perverse incentives of pay-per-view advertising sales.

There is no subroutine to stop news organizations from competing in a race to the editorial bottom, seduced by clickbait and lusting for attention at any cost.

No matter how well we code, no matter how convincingly we simulate and augment reality, our brains and bodies still know that what we experience on a screen is—in an important but ambiguous way—not real.

The people aren’t really there, so we hate them. The approval isn’t real, so it’s never truly satisfying. And if we manage to make a friend online, we can’t help but fantasize about how different it might be to meet IRL. (Yes, that’s an acronym for “in real life.”)

Tweak the algorithm all you want. It will never be a worthy substitute for a good book, a healthy debate, or an honest friendship. As long as we trust software to shape our interaction with the world, life will be a disappointing, chaotic, infinite scroll.

This article was published by Rob Howard, the author of Hiatus, the weekly current events briefing with no links, no likes, and no distractions. You can read the original article here.
