We are all uncertain about our safety online and constantly searching for answers to privacy and security questions. Many applications are continually working to keep their users safe, and Instagram is now building a feature to protect its users from unsolicited nude pictures.

Instagram’s parent company, Meta, confirmed to The Verge that the feature was in development after an app researcher published an early image of the tool.

Meta says the optional user controls, which are still in the early stages of development, will help people shield themselves from nude photos as well as other unwanted messages.

The tech behemoth compared these controls to its “Hidden Words” function, which enables users to automatically screen direct message requests that contain objectionable language.

App researcher Alessandro Paluzzi also tweeted about the feature’s development. As described in Paluzzi’s preview, nudity protection is an on-device mechanism that covers photos in chats that might contain nudity. The teaser also states that these pictures won’t be accessible to the company.

According to Meta, the technology will not allow the company to view the actual messages or share them with third parties. “We’re working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive,” Meta spokesperson Liz Fernandez told The Verge. Meta says it will share more details about the feature in the coming weeks as it gets closer to testing.

Here’s hoping that this resolves not only the problem of unsolicited nudity but also harassment and other unsettling content on the platform.