YouTube has published new policies for its video platform that state it won’t allow content depicting dangerous pranks and stunts like the Bird Box and Tide Pod challenges. Good.
The company wants to prevent creators from engaging in these activities and from encouraging viewers to participate. Its updated Community Guidelines address the issue as follows:
Content that encourages violence or dangerous activities that may result in serious physical harm, distress or death violates our harmful and dangerous policy, so we’re clarifying what this means for dangerous challenges and pranks. YouTube is home to many beloved viral challenges and pranks, but we need to make sure what’s funny doesn’t cross the line into also being harmful or dangerous. We’ve updated our external guidelines to make it clear that we prohibit challenges presenting a risk of serious danger or death, and pranks that make victims believe they’re in serious physical danger, or cause children to experience severe emotional distress.
YouTube’s Dangerous Challenges and Pranks Enforcement FAQ explains that it’s okay to share clips of people flipping water bottles and of parents giving their kids awful Christmas presents. However, videos of home invasion pranks and of people tricking children into believing their parents have died won’t be tolerated on the site.
The company really shouldn’t have to spell this out – but people can be stupid, inconsiderate and downright cruel. Uploading prohibited content will earn you a strike that stays on your channel for 90 days and strips privileges like live streaming. Rack up three strikes within that window, and you’re off the platform.
YouTube also noted that it won’t allow thumbnails depicting pornography or graphic violence – a problem that BuzzFeed News’ Davey Alba highlighted today in her story about images of graphic bestiality in search results.
These aren’t easy problems to fix, even with the help of AI. Artificial intelligence systems can’t catch every video that violates platform guidelines, which means humans have to moderate content manually – in other words, look at a lot of depraved stuff to keep users safe.
The trauma these moderators face has been documented over the years, and things haven’t gotten much better for them. Last March, YouTube said it limited moderators’ shifts to four hours a day in a bid to protect their mental health. That still amounts to four hours of looking at videos you’d never want to see.
One can only naively hope that the updated guidelines will help reduce the amount of filth that viewers and moderators encounter on the platform.