Meta Platforms, the parent company of social media giants Instagram, Facebook and WhatsApp, is rolling out important changes aimed at enhancing the safety and well-being of teen users. The move comes as the company faces lawsuits and scrutiny, with over 40 U.S. states alleging that it misled the public about the harmful effects of its services on young people.
In an official blog post, Meta revealed its commitment to providing safe and age-appropriate experiences for teens on its apps. The company has developed over 30 tools and resources over the years to support teens and their parents, with a particular focus on addressing content that may break platform rules or be deemed sensitive.
One major announcement is the introduction of new content policies tailored to the types of content teens encounter on Instagram and Facebook. Meta acknowledged the importance of consulting experts in adolescent development, psychology, and mental health to create a safer online environment for young users.
Another key change involves the removal of content discussing sensitive topics, such as self-harm, from teens’ experiences on Instagram and Facebook. Meta acknowledged the complexity of these topics and their potential unsuitability for all young audiences. The company will no longer display this type of content in teens’ Feeds and Stories, even if it is shared by accounts they follow.
To further support teens, Meta will continue to share resources from expert organisations, such as the National Alliance on Mental Illness, when users post content related to struggles with self-harm or eating disorders. These changes are gradually rolling out to teens under 18 and are expected to be fully implemented on Instagram and Facebook in the coming months.
Meta is also implementing updates to content recommendation settings for teens. All teens will now be automatically placed in the most restrictive content control settings on both Instagram and Facebook. These controls aim to make it more challenging for teens to come across potentially sensitive content or accounts in features like Search and Explore.
The company is also hiding more results in Instagram searches related to suicide, self-harm, and eating disorders. While Meta allows users to share content discussing their struggles with these issues, such results will now be hidden in search, and users will instead be directed to expert resources for help.
In addition, Meta is prompting teens to update their privacy settings. Notifications will encourage teens to adopt more private settings with a single tap, restricting who can repost their content, tag or mention them, or include their content in features like Reels Remixes.