Yesterday, Instagram announced that it will start demoting content that doesn't violate its Community Guidelines but might be 'inappropriate.' The announcement says that posts that are 'sexually suggestive', 'hurtful', or 'violent' can be excluded from the Explore section and hashtag pages.
This means that dank meme you posted, as well as that raunchy picture of yourself, could get fewer eyeballs if Instagram is too prudish to recommend it to a wider audience. It'll still be viewable on your profile and in your followers' feeds, but it won't surface as readily for the broader community browsing Explore and hashtag pages.
The photo-sharing company notes on its Help Center page that 'inappropriate' content will only show up in your feed if you follow the account:
While some posts on Instagram may not go against our Community Guidelines, they might not be appropriate for our global community, and we’ll limit those types of posts from being recommended on Explore and hashtag pages. For example, a sexually suggestive post will still appear in Feed if you follow the account that posts it, but this type of content may not appear for the broader community in Explore and hashtag pages.
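To make the distinction concrete, here's a minimal sketch of how that kind of per-surface gating might work. Instagram hasn't published any implementation details, so all the names, labels, and logic below are illustrative assumptions, not the company's actual system:

```python
from dataclasses import dataclass

# Hypothetical labels a post might carry after review; Instagram has not
# published its actual taxonomy, so these are assumptions based on the
# categories named in the announcement.
BORDERLINE_LABELS = {"sexually_suggestive", "hurtful", "violent"}

@dataclass
class Post:
    author_id: int
    labels: set

def visible_in_feed(post: Post, viewer_follows: set) -> bool:
    # Borderline posts still reach the feeds of people who follow the author.
    return post.author_id in viewer_follows

def visible_in_explore_or_hashtags(post: Post) -> bool:
    # ...but they are filtered out of recommendation surfaces entirely.
    return not (post.labels & BORDERLINE_LABELS)

post = Post(author_id=42, labels={"sexually_suggestive"})
print(visible_in_feed(post, viewer_follows={42}))   # True: followers still see it
print(visible_in_explore_or_hashtags(post))         # False: hidden from Explore
```

The key point, per the Help Center language, is that demotion isn't removal: the same post passes one visibility check and fails the other.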
It's understandable that a privately owned company may want to portray its platform as wholesome and free of content that isn't appropriate for all audiences. But Instagram hasn't issued clear guidelines to help creators understand whether their posts will be demoted. And as of now, there's no way to tell whether the service is demoting your posts and limiting their reach. That could hurt people who run businesses or have built large audiences as influencers on Instagram. We've asked the company for details, and we'll update this post if we hear back.
Instagram's product lead for Discovery, Will Ruben, told TechCrunch that the company is using a combination of machine learning and trained content moderators to label borderline content.
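Ruben didn't share specifics, but a common pattern for pipelines like this is to let a classifier flag likely-borderline posts and have human moderators confirm the label before demotion kicks in. Here's a rough sketch of that idea; the thresholds and function names are made up for illustration and don't reflect Instagram's actual system:

```python
from typing import Optional

# Illustrative only: a model score routes posts to human review, and
# demotion is triggered either by a very confident model or by a
# moderator-confirmed label. Both thresholds are assumptions.
REVIEW_THRESHOLD = 0.6   # above this, send the post to a moderator
AUTO_THRESHOLD = 0.95    # above this, demote without waiting for review

def should_demote(model_score: float, moderator_confirmed: Optional[bool]) -> bool:
    if model_score >= AUTO_THRESHOLD:
        return True
    if model_score >= REVIEW_THRESHOLD:
        # Demote only once a trained moderator has confirmed the label.
        return moderator_confirmed is True
    return False

print(should_demote(0.97, None))   # True: high-confidence model call
print(should_demote(0.70, True))   # True: moderator confirmed the label
print(should_demote(0.70, None))   # False: still awaiting review
```

A hybrid setup like this trades speed for accuracy: the model catches obvious cases instantly, while humans adjudicate the genuinely borderline ones.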
The app has a long-standing policy against nudity. It states that nudity is only allowed in photos of paintings and sculptures, and in photos showing post-mastectomy scarring or women actively breastfeeding.
In 2014, there was a huge uproar when the company removed a topless photo posted by Rihanna. Even after many users posted their own photos under the #FreeTheNipple movement, Instagram was unmoved.
“Even when this content is shared with good intentions, it could be used by others in unanticipated ways,” the company says in its Community Guidelines. Despite all its efforts, several reports suggest Instagram hasn't been successful at removing pornographic content.
The new policy will also demote offensive memes, as well as posts that contain misinformation. This comes after a report The Atlantic published last month noting that the platform is plagued by conspiracy theories and hateful content.
While this might be a progressive step towards keeping hateful content in check, Instagram will have to make sure it doesn't wrongfully demote posts from creators who depend on the platform to earn money or grow their personal brands.