
Instagram is experimenting with a new nudity filter that would block likely nudes in users’ Instagram Direct messages, giving users yet another way to shield themselves from unwanted content.

The new “nudity protection” option would enable Instagram to activate the nudity detection element in iOS, which scans incoming and outgoing messages on your device to detect potential nudes in attached images.

The feature was discovered by app researcher Alessandro Paluzzi, as shown in the tweet below.

Instagram notes that whenever potential nudity is detected, the system will automatically blur the image. Because the detection happens on your device, Instagram and parent company Meta would not be downloading and analyzing the images themselves.


Of course, the fact that iOS is scrutinizing your messages and filtering them based on their content still raises some questions.

Instagram has sought to reassure users that it is not downloading the actual images, and that all of this is done through machine learning and data matching, which doesn’t trace or track the specifics of your individual interactions.

However, Instagram would be keeping track of how many images it detects and blurs through this process, which also means it would have data on how frequently nudes are sent to you.

In any case, it could be another crucial step for Instagram, which has been working to increase protection for younger users, and the potential safety value may outweigh any such worries.

In response, Meta has also launched a number of new safety tools and features, such as updated in-app “nudges” and “Take a Break” reminders, which are designed to steer users away from potentially harmful topics.

Additionally, it has tightened the default sensitive-content settings for young users, placing all account holders under the age of 16 in the category with the strictest exposure controls.