YouTube will start using AI to automatically add age restrictions to videos “over the coming months,” the company announced on Tuesday.
The site currently relies on content reviewers to flag videos that aren’t appropriate for viewers under 18, but will soon start using machine learning to detect content for review. Uploaders will be able to appeal the system’s decisions.
In addition, viewers who try to evade the restrictions by watching videos embedded on third-party websites will be redirected to YouTube, where they’ll have to sign in to prove they’re over 18.
“This will help ensure that, no matter where a video is discovered, it will only be viewable by the appropriate audience,” the company said in a blog post.
The new restrictions are YouTube’s latest response to concerns over the protections it provides to young users. The site has been engulfed by a series of scandals involving videos of child abuse, and last year it was fined $170m for collecting kids’ personal data without their parents’ consent.
Last August, the company launched a web version of the YouTube Kids mobile app for children under 13, which offers curated content, parental control features, and filtering of videos.
The new automated restrictions will detect content on the main platform deemed inappropriate for anyone under 18.
YouTube’s recent experiments with content moderation suggest that the automated system will lead to far more videos being age-restricted.
In the second quarter of this year, YouTube took down a record number of videos after increasing the role of AI in its content review efforts. Neal Mohan, the company’s chief product officer, told the Financial Times this week that the site recently returned to using more human moderators. But for age restrictions, the company clearly feels that more AI is required.