Social media giant Meta is now expanding its “Teen Accounts” feature to Facebook and Messenger. Originally launched on Instagram in September last year, Teen Accounts are designed to automatically enforce privacy settings, content restrictions, and parental controls for users under 18. The roll-out will initially cover users in the US, UK, Australia, and Canada, with global availability expected to follow in the coming months.
Teen Accounts first arrived as Meta’s response to criticism over the role of social media in harming young users’ mental health and exposing them to inappropriate content or unwanted interactions. The feature automatically places teens into a restricted app experience that minimizes their exposure to potentially harmful content while giving parents greater oversight; users aged 13 to 15 cannot loosen these restrictions without a parent’s permission. Once set up, Teen Accounts limit interactions to friends and previously contacted users, so strangers cannot send direct messages or comment on a teen’s content unless they already have a connection.
Meta says early adoption metrics for Teen Accounts on Instagram are promising: the company reports having migrated 54 million teens globally to the new restricted experience, and 97% of teens aged 13 to 15 have kept the built-in protections rather than opting out. Meta also commissioned a study by research firm Ipsos, which found that 94% of surveyed parents considered Teen Accounts helpful in supporting their parenting efforts online, and 85% felt the tools made it easier to foster positive online experiences for their children.
“Last year, we reimagined the Instagram experience for teens – and their parents – by introducing Instagram Teen Accounts. Teen Accounts have built-in protections that limit who can contact teens and the content they see. We automatically place teens into Teen Accounts, and teens under 16 need a parent’s permission to change any of these settings to be less strict. Since making these changes, 97% of teens aged 13-15 have stayed in these built-in restrictions, which we believe offer the most age-appropriate experience for younger teens,” Meta announced in an official statement.
“These are major updates that have fundamentally changed the experience for teens on Instagram. We’re encouraged by the progress, but our work to support parents and teens doesn’t stop here, so we’re announcing additional protections and expanding Teen Accounts to Facebook and Messenger to give parents more peace of mind across Meta apps,” the company added.
Alongside the Facebook and Messenger expansion, Meta is further tightening protections on Instagram. Teens under 16 will now need parental approval to start a live broadcast on the platform, as well as to disable the “nudity protection” feature, which automatically blurs images suspected of containing nudity in direct messages, an added safeguard for underage users on the image-sharing platform. In a nod to growing concerns over social media addiction and its impact on mental health, Meta has also introduced features that encourage teens to take breaks from their devices. These include daily reminders to log off after an hour in the app, as well as the automatic activation of “Quiet Mode” during nighttime hours, which mutes notifications and discourages late-night scrolling.