Company Announces They Will Reduce Reach of Posts that Are Borderline Violations of Community Guidelines
It’s no secret that Instagram is home to many scantily clad models, memes in poor taste, footage that requires some viewer discretion, and outright spam. That could be changing, though, following a recent announcement from Facebook.
Facebook, Instagram’s parent company, announced that they’re going to be taking a harder line against content that comes close to violating community guidelines by limiting its reach.
“We have begun reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines, limiting those types of posts from being recommended on our Explore and hashtag pages.”
What does this mean? In short, Instagram won’t be outright banning content that pushes the envelope, but it will limit how discoverable that content is through the app’s search and recommendation features.
Instagram gave sexually suggestive posts as one example. Users who follow an account that posts this kind of content will still see it in their feeds, but it may no longer surface on the Explore page or in hashtag searches.
This raises the question: what counts as inappropriate content?
While Instagram offered “sexually suggestive” posts as one example, it’s still not entirely clear what that will mean in practice. “Borderline content” could encompass a much wider range of material than initially thought, and Instagram’s documentation apparently offers no guidelines on what qualifies.
A report from TechCrunch offers a few more specifics, but it also notes that users remain largely in the dark about what constitutes borderline inappropriate content.
Generally, it seems Instagram will be keeping content that is violent, graphic, shocking, or sexually suggestive out of its recommendations.