Instagram today announced a new feature that expands on the offensive comment filter it introduced last year – a bullying comment filter.

Like its predecessor, the new feature will automatically filter out toxic and divisive comments, especially those targeted at at-risk groups. While the Facebook-owned company hasn’t specified exactly how the filter works, it says targeted users will never have to see the hurtful comments.

If a comment attacks a person’s appearance or character, or threatens their health or wellbeing, it’ll be automatically hidden. If such comments keep coming, an alert will be sent to Instagram so it can deal with the issue.

The platform is also expanding its policies to protect young public figures from bullying. Once it’s live, the feature can be toggled on and off via the settings menu as per the image below.

Image: Instagram’s new feature will automatically filter out toxic comments

The update is expected to roll out to all users today. It’s always nice to see social media platforms across the board doing more to combat the hateful shit being hurled around on them. Kudos to Instagram for this one.

Image: Getty Images