Instagram, the Meta-owned social media platform, is giving parents more control over their teens’ messaging and content settings while making the platform safer for underage users. The move comes as the platform faces mounting pressure from regulators and child advocacy groups to do more to protect minors online.
One of the most impactful changes in Instagram’s new framework is that accounts belonging to users under 16 will automatically be set to private. This ensures that only approved followers can view a teen’s posts, drastically reducing the chances of inappropriate contact from strangers. Meta will also use AI to reinforce these protections: it will detect when users misrepresent their age, ensuring that teens are placed into accounts with appropriate safeguards, and it will analyze user interactions to flag accounts that are likely operated by minors.
In addition to restricting account visibility, Instagram is rolling out features that limit interactions between teens and people they do not follow, including restrictions on tagging, messaging, and mentioning accounts in posts. Teens will only be able to interact with people they are connected to, creating a safer social environment. The changes will roll out in stages: they will first take effect for users in the US, UK, Canada, and Australia within 60 days, with a broader expansion to the European Union planned for later this year. Instagram’s global teen user base is expected to be fully transitioned to the new system by 2025.
Another critical focus of Instagram’s update is minimizing the exposure of teens to harmful or sensitive content. The app’s algorithm will automatically filter out posts promoting violence, self-harm, eating disorders, or cosmetic procedures—topics that have historically been linked to negative mental health outcomes for teens. Additionally, Instagram is implementing more restrictive content settings for teen accounts, making it more difficult for inappropriate or harmful content to appear in a young user’s feed, particularly on the Explore and Reels pages.
Instagram’s “Hidden Words” feature will also automatically filter offensive language and toxic phrases from the comments and messages that teens receive, making online bullying and harassment harder to carry out. Alongside this, Instagram is introducing a suite of new parental control tools. These allow parents to oversee their teen’s account in real time, including seeing who their child has messaged recently, though the actual content of the messages will remain private. Parents can thus step in if they notice concerning interactions while still respecting their child’s privacy to a degree.
Screen time has been a growing concern for parents, medical professionals, and policymakers in recent years, and Instagram is responding. Parents will have the power to set limits on how much time their teens can spend on the app and to restrict usage during certain hours, addressing one of the most persistent worries associated with social media. With features such as “sleep mode,” parents can ensure that their teens are not using the app between 10 p.m. and 7 a.m., encouraging healthier habits and reducing the potential negative impact of social media on mental health.