Instagram has started taking steps to reduce the reach of posts that it deems inappropriate but that do not fully breach the company’s guidelines. This means that a post that is sexually suggestive, but does not depict a sexual act or nudity, will still receive fewer views.
Likewise, a post that falls short of the company’s definitions of hate speech or harassment, but is considered to be in bad taste, hurtful or violent, will also see reduced distribution.
The new changes come into effect alongside a series of updates from Instagram’s parent company, Facebook, on managing problematic content.
“We have begun reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines, limiting those types of posts from being recommended on our Explore and hashtag pages,” Facebook said in a statement.
The company explains that a sexually suggestive post may still appear in the Feed of users who follow the account that posted it; however, the post, and similar content, will not appear for other users on the Explore or hashtag pages.
“We’ve started to use machine learning to determine if the actual media posted is eligible to be recommended to our community,” TechCrunch reported, quoting Instagram’s product lead for Discovery, Will Ruben.
Instagram is training its content moderators to recognise and label borderline content while they review posts for policy violations, and it will then use these labels to train an algorithm to identify such content automatically, the report said.
“As content gets closer and closer to the line of our Community Standards at which point we’d remove it, it actually gets more and more engagement. It’s not something unique to Facebook but inherent in human nature,” the report said, quoting Facebook’s Henry Silverman.
Separately, Facebook will reportedly shut down its Facebook, Instagram and Messenger apps for Windows Phone by the end of this month.