This article was published on April 3, 2019

Bloomberg report explains why YouTube hosts so much toxic content

It's the metrics, stupid.


Yesterday, Bloomberg’s Mark Bergen published a devastating piece charting how the once-innocuous YouTube grew into the world’s most effective radicalization machine and, in the process, a haven for conspiracy theorists and actual honest-to-god paedophiles.

The reason is… well, it’s not even remotely surprising. In short, YouTube sacrificed good governance and responsible stewardship at the altar of growth.

According to the report (which is summarised nicely here, if you’re short on time), YouTube’s CEO Susan Wojcicki and other company executives set aggressive goals for growth and engagement. Wojcicki’s tenure began with a single headline target: one billion hours of daily watch time. Since then, the company has been driven by metrics rather than by what’s best for society and its users.

A cornerstone of the company’s growth strategy is its controversial recommendations engine, which keeps users on the site by showing them videos deemed relevant to their interests. Bergen reports that one YouTube privacy engineer asked that videos regarded as “close to the line” (that is, verging on inappropriate without actually breaking the rules) be removed from recommendations. The request was ignored by the employee’s superiors.

Another example of the company’s irresponsible growth strategy is “Project Bean,” a proposed overhaul of the platform that would have rewarded creators not on traditional metrics (ad clicks and impressions) but on engagement. Here’s how it would have worked, according to Bloomberg:

It centered on a way to pay creators that isn’t based on the ads their videos hosted. Instead, YouTube would pay on engagement—how many viewers watched a video and how long they watched. A special algorithm would pool incoming cash, then divvy it out to creators, even if no ads ran on their videos. The idea was to reward video stars shorted by the system, such as those making sex education and music videos, which marquee advertisers found too risqué to endorse. 
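
Bloomberg doesn’t detail the payout formula, but the core idea (splitting a shared pool of cash by watch time rather than ad performance) is simple enough to sketch. The snippet below is a hypothetical illustration, not YouTube’s actual algorithm; the channel names and numbers are invented.

```python
# Illustrative sketch only: the real Project Bean formula was never published.
# Assumption: the pool is split among creators in proportion to watch time,
# regardless of whether any ads ran on their videos.

def divvy_pool(pool_cash, watch_hours):
    """Split a shared revenue pool by each creator's share of total watch time."""
    total = sum(watch_hours.values())
    if total == 0:
        return {creator: 0.0 for creator in watch_hours}
    return {creator: pool_cash * hours / total for creator, hours in watch_hours.items()}

# A sex-ed channel with no ads still gets paid, but so does a conspiracy channel,
# as long as people keep watching.
print(divvy_pool(1_000_000, {"sex_ed": 40_000, "music": 35_000, "conspiracy": 25_000}))
# {'sex_ed': 400000.0, 'music': 350000.0, 'conspiracy': 250000.0}
```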

Although it would have benefited legitimate creators who are largely ignored by risk-averse advertisers, it would also have rewarded the most sensational and outrageous creators, like those suggesting that vaccines cause autism or that school shootings are a shadowy government conspiracy. It would likely have exacerbated the endemic problem of fake news and extremism on the platform. Thankfully, Google CEO Sundar Pichai ultimately rejected the idea, but not before YouTube engineers had spent over a year working on it.

Bergen’s piece is arguably the most important critique of YouTube published in recent years. It’s dispassionate and methodical. It lays bare the failures of the site’s management in moderating the platform and responding proactively to problems. Worse, it suggests that management were oblivious to the position YouTube holds in our media discourse. The word “damning” is bandied about a lot, but trust me when I say this piece is worthy of the term.

Will anything come of it? Probably not. YouTube is YouTube. When you reach a certain scale, you become invulnerable to most external forces. That includes pesky journalists. However, the article does an excellent job of mapping YouTube’s countless mistakes. Perhaps the next CEO can read it and avoid them.


