
YouTube’s decision to make it harder to monetize videos is hurting the community

YouTube is raising the requirements that channels on its platform must meet to become eligible to earn money from ads run before and during their videos.

In April 2017, it began requiring channels to have a minimum of 10,000 lifetime views to qualify for its monetization program; it has now raised that bar to 4,000 hours of watch time within the past 12 months and 1,000 subscribers. The company explained its revamped criteria in a blog post:

They will allow us to significantly improve our ability to identify creators who contribute positively to the community and help drive more ad revenue to them (and away from bad actors). These higher standards will also help us prevent potentially inappropriate videos from monetizing which can hurt revenue for everyone.

That’s bad news for smaller channels that may not command large audiences but still play by YouTube’s rules and earn money through their videos.

Anurag Shanker, a Mumbai, India-based composer and music producer who runs a handful of YouTube channels, explained that the new requirements pose a difficult challenge for up-and-coming creators:

Previously, it was possible to earn at least enough to cover the cost of your own DIY video projects over time. The gap between YouTube’s earlier requirements and the new ones is massive. Garnering 4,000 hours of watch time is a whole different ball game than trying to build an audience organically without specializing in video production and publishing. For myself and my colleagues, that means shelving some upcoming projects, because we’ll now need to find other ways to fund them.


But this is about YouTube looking out for its own interests: its platform played host to plenty of disturbing content last year, including videos depicting violent imagery featuring beloved children’s cartoon characters. It also lost millions of dollars in revenue as numerous major brands boycotted YouTube for running their ads alongside racist and homophobic content, as well as on clips that attracted “comments from hundreds of pedophiles.”

It hardly seems like the best approach to fixing what’s broken at YouTube, though. Sure, it’s a massive platform, with hundreds of hours of video uploaded every minute, so it can’t be easy to police as effectively as we’d like. But the truth is that this approach punishes smaller creators while allowing some offenders to slip through the cracks.

Last February, Swedish streamer PewDiePie had more than 53 million subscribers when he published a video showing two shirtless men laughing as they held up a banner that read, “Death to All Jews.” This year, Logan Paul, another YouTuber popular with younger audiences, posted a clip with the camera trained on the body of a suicide victim in Japan. In both instances, it was the creators who took down the videos, not YouTube. The company also didn’t demonetize or ban DaddyOFive, a channel that frequently filmed and published videos depicting acts of child abuse.

Ultimately, this move could help YouTube avoid coming under fire for having its ads run on videos from malicious actors, but it also hurts its community and doesn’t address the real concerns surrounding an open-to-all platform. The company needs to go back to the drawing board and figure out smarter ways of identifying and dealing with troubling content. Where’s AI when you need it most?
