In response to growing concerns over user safety in online spaces, the popular image-sharing platform Instagram is introducing new features aimed at enhancing security and combating online exploitation, especially of underage and teenage users. The measures come amid mounting criticism of the platform’s handling of harmful content and its impact on vulnerable groups such as children and teenagers.

According to reports, the new features will roll out gradually: testing is set to begin shortly, ahead of a broader global rollout in the coming months.

A notable addition to Instagram’s arsenal of safety features is nudity protection in private messages. Leveraging machine learning, this feature automatically detects and blurs images containing nudity, preserving user privacy while mitigating the risk of exposure to inappropriate content. Meta, Instagram’s parent company, notes that the tool will be enabled by default for users under 18, with the option for others to opt in voluntarily. When a blurred image is received, users will also be prompted with options to block the sender, adding a further layer of protection; and if they attempt to forward the image, Instagram will send a message encouraging them to reconsider.
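The decision flow described above can be sketched roughly as follows. This is a minimal illustration only: the actual model, threshold, and function names are not public, so everything here (`nudity_score`, `NUDITY_THRESHOLD`, the action labels) is an assumption for clarity.

```python
from dataclasses import dataclass

# Assumed confidence cutoff for the (hypothetical) nudity classifier.
NUDITY_THRESHOLD = 0.8

@dataclass
class IncomingImage:
    sender_id: str
    nudity_score: float  # hypothetical classifier confidence, 0.0-1.0

def handle_incoming_image(image: IncomingImage, recipient_age: int,
                          opted_in: bool) -> dict:
    """Decide how a DM image is presented to the recipient."""
    # Protection is on by default for under-18s; adults must opt in.
    protection_on = recipient_age < 18 or opted_in
    flagged = image.nudity_score >= NUDITY_THRESHOLD
    if protection_on and flagged:
        # Blur the image and surface safety actions alongside it.
        return {
            "display": "blurred",
            "actions": ["view_anyway", "block_sender", "report"],
        }
    return {"display": "normal", "actions": []}
```

For example, a flagged image sent to a 16-year-old would be blurred with block and report options attached, while the same image sent to an adult who has not opted in would display normally.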

“While people overwhelmingly use DMs to share what they love with their friends, family or favorite creators, sextortion scammers may also use private messages to share or ask for intimate images. To help address this, we’ll soon start testing our new nudity protection feature in Instagram DMs, which blurs images detected as containing nudity and encourages people to think twice before sending nude images. This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Meta noted in an official blog post on the matter.

In addition to nudity protection, Instagram is rolling out measures to detect and deter potential scammers and perpetrators of sextortion, a form of digital blackmail involving intimate media. Suspicious message requests will now be routed to hidden folders, reducing the likelihood that users encounter harmful content. Individuals already in conversation with such accounts will receive warnings and guidance on setting boundaries and reporting abusive behavior.
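The routing behavior described above amounts to a simple branch on whether the sender has been flagged and whether a conversation already exists. The sketch below is purely illustrative; the flag name, folder names, and return shape are assumptions, not Meta’s actual internals.

```python
def route_message_request(sender_flagged_for_sextortion: bool,
                          in_active_chat: bool) -> dict:
    """Illustrative routing of a message request from a flagged account."""
    if sender_flagged_for_sextortion:
        if in_active_chat:
            # Existing conversations surface a Safety Notice instead
            # of being silently hidden.
            return {"folder": "inbox", "notify": True,
                    "safety_notice": True}
        # New requests go straight to the hidden requests folder,
        # with no notification to the recipient.
        return {"folder": "hidden_requests", "notify": False,
                "safety_notice": False}
    # Ordinary requests land in the normal requests folder.
    return {"folder": "requests", "notify": True, "safety_notice": False}
```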

“One way we’re doing this is by making it even harder for potential sextortion accounts to message or interact with people. Now, any message requests potential sextortion accounts try to send will go straight to the recipient’s hidden requests folder, meaning they won’t be notified of the message and never have to see it. For those who are already chatting to potential scam or sextortion accounts, we show Safety Notices encouraging them to report any threats to share their private images, and reminding them that they can say no to anything that makes them feel uncomfortable,” Meta noted.

Furthermore, Instagram is investing in support resources to empower users and promote responsible online behavior. Pop-up messages will inform users of the risks of sharing sensitive content, including nude images, offer safety tips, and remind them of available options such as message deletion and reporting tools. Resources and support networks will also be readily accessible to anyone who may have interacted with accounts involved in sextortion scams, offering guidance through these difficult situations. In addition, the company is bringing new child safety helplines into its in-app reporting flows.

If these measures work as intended, they should benefit users in a few key ways. First, they will create a safer online environment, particularly for younger users: blurring potentially inappropriate images can shield them from unwanted exposure, and anti-sextortion tools can make it harder for predators to exploit and manipulate them. Users will also be better equipped to identify and avoid scams and manipulative tactics, while support resources can provide a lifeline for those facing harassment or abuse.