Trolling may seem to be the order of the day for many people online, but Instagram is having none of it, and things are about to change on the platform.
The social media platform aims to stamp out trolling with its comment-reporting feature, which lets users flag ‘abusive content’ or ‘spam or scam.’
Now, Instagram appears to be developing more in-depth ways for users to report comments.
Jane Manchun Wong, an app researcher based in Hong Kong, has revealed that Instagram is working on a new comment-reporting feature similar to Facebook’s.
The new feature gives users a much wider range of options for why they’re reporting a comment.
New options include ‘nudity or pornography’, ‘hate speech or symbols’, ‘spam’, ‘violence or threat of violence’, ‘sale or promotion of firearms’, ‘sale or promotion of drugs’, ‘harassment or bullying’, ‘intellectual property violation’, ‘self injury’, or simply ‘I just don’t like it.’
It remains unclear when Instagram plans to roll the feature out to users.
This isn’t the first time that Instagram has introduced measures to limit bullying on the platform.
Back in October, the platform launched a range of anti-bullying features, including comment filters on live videos, kindness camera effects and AI systems to automatically detect bullying in photos.
In a blog post announcing the update, Adam Mosseri, the new Head of Instagram, said: “There is no place for bullying on Instagram.
“If people see that kind of hurtful behaviour on our platform, they can report it and we remove it.
But online bullying is complex, and we know we have more work to do to further limit bullying and spread kindness on Instagram.”