Instagram launching comment control and follower removal tools to curb abuse
Instagram is shoring up its defenses against abuse with the launch of tools people can use to restrict who reacts to their photos and videos. The photo-sharing company announced on Tuesday that it now supports the ability to restrict who can comment, remove followers from private accounts, and anonymously report posts that suggest someone may be harming themselves.
The release of these new features is said to be part of Instagram’s efforts to make the service a “positive place for self-expression” — in other words, to keep it from becoming the kind of troll-infested environment seen on Reddit and Twitter.
With the new commenting control tool, if you find that someone has posted a mean-spirited message on a photo or video you’ve shared, you don’t have to just sit there and take it. On top of the existing ability to filter out comments by keyword, you can now disable comments on entire posts. This was previously available only to “a small number of accounts,” but “in a few weeks” it will roll out to everyone. Prior to posting, tap the “advanced settings” option, choose “turn off commenting,” and that’s it. Comments can be re-enabled later by toggling the same option.
In addition, Instagram will soon allow its more than 500 million monthly active users to heart individual comments, not just the photo or video itself. The company hopes that bringing liking to the comment level will help people “show support and encourage positivity throughout the community.”
For those of you with a private account, Instagram now lets you remove followers without blocking them. Why this hasn’t been possible until now is unclear, but if you let someone see your posts, there previously was no way to get rid of them without resorting to blocking the account. Now you can remove them by going into your list of followers and tapping the … menu next to their name. When this action is taken, no notification is sent, so the removed follower won’t know you don’t want to be friends.
Lastly, there’s a feature every social network needs to have in place: a way for people to flag that someone may be threatening self-injury. Instagram chief executive Kevin Systrom explained: “From time to time, you may see friends struggling and in need of support.” If you see a post from someone that suggests self-harm, the service now lets you report it anonymously. A team around the world reviews these reports 24 hours a day, 7 days a week, and will connect that person with organizations that can help.
For many, Instagram has been a place where people share not only what they’re seeing, but also their art. It has blown up into a community, but with more people on it, the danger of harassment and abuse increases — nowhere more so than if you post content about politics, national and world issues, or social progress. We’ve seen it on Twitter, Facebook, Reddit, and many other sites. Today’s release is likely an attempt to stem such abuse before it takes hold. Instagram hasn’t yet been in the news over harassment the way its counterparts have, but with such a large audience, that could soon change.
I, for one, would appreciate a crackdown on trolls and spammers on Instagram. Hardly a day goes by when I post something with hashtags that doesn’t spur “commenters” to pitch ways to get new followers or tag me in random comments. That leaves me to either ignore it or spend extra seconds blocking and reporting the offending account.
Systrom said these features are only the beginning, with more expected in the future.