Instagram is now going to block offensive comments before you ever see them.
Or try to, anyway.
As part of its commitment to "foster kind, inclusive communities" (aka not have a hellhole of a comments section), Instagram is rolling out two new language filters for English-language users.
The first blocks "certain offensive comments" on posts and in live video. Every other comment will still show up as normal, and any offensive ones that slip through can still be reported or deleted.
You can turn this filter off by going into your settings and scrolling down to the "Comment" section.
The second filter specifically targets "obvious spam" comments, and will block those automatically as well.
How this is different from the current 'inappropriate' filter
Instagram already offered an option to hide "inappropriate" comments. The feature, rolled out in 2016, blocked replies that contained words from a default "inappropriate" list, and you could add your own words to be blocked.
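That older filter is essentially a word-list check. Here's a minimal sketch of how such a filter might work; the word lists and function name are invented for illustration, not Instagram's actual implementation:

```python
import re

# Stand-in defaults -- the real default "inappropriate" list is Instagram's own.
DEFAULT_BLOCKLIST = {"jerk", "loser"}
custom_blocklist = {"spoilers"}  # words a user adds themselves

def is_hidden(comment: str, extra_words: set = frozenset()) -> bool:
    """Hide a comment if it contains any blocked word."""
    words = set(re.findall(r"[a-z']+", comment.lower()))
    return bool(words & (DEFAULT_BLOCKLIST | set(extra_words)))

print(is_hidden("You total jerk"))                        # True
print(is_hidden("Great photo!"))                          # False
print(is_hidden("no spoilers please", custom_blocklist))  # True
```

The weakness of this approach is obvious: it only catches exact words on the list, which is exactly what the new learned filters are meant to improve on.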
The new filters announced this week are going to learn.
"Our team has been training our systems for some time to recognize certain types of offensive and spammy comments so you never have to see them," Instagram said in the announcement. "The tools will improve over time, enabling the community’s experience of sharing to improve as well."
The entire thing is done via machine learning, which Instagram calls an "important step" in keeping the app a kind, inclusive place. The company says toxic comments discourage users and businesses from posting on the app.
As the algorithm improves, the new filters could be available in more languages.
WIRED has a story about the Facebook-developed system behind it, called DeepText.
Facebook, which owns Instagram, talked this week about some of the big challenges of filtering out hate speech.
One of their main conclusions? We are "a long way from being able to rely on machine learning and AI" to understand and assess hate speech.