Instagram’s New Feature Will Automatically Filter Out Toxic Comments

Instagram has today announced a new feature that expands on the offensive comment filter it implemented last year: a bullying comment filter.

Like its predecessor, the new feature will automatically filter out toxic and divisive comments, especially those targeted at at-risk groups. While the Facebook-owned company hasn’t specified exactly how the filter works, it says targeted users will never have to see the hurtful comments.

If a comment contains attacks on a person’s appearance or character, or threatens their health or wellbeing, it’ll be automatically hidden. If the comments keep coming, Instagram will be alerted so it can deal with the issue.

The platform is also expanding its policies to protect young public figures from bullying. Once it’s live, the feature can be toggled on and off via the settings menu as per the image below.

The update is expected to roll out to all users today. It’s always nice to see social media platforms across the board do more to combat the hateful shit being hurled around on them. Kudos to Instagram for this one.
