Meta, the parent company of Instagram, has rolled out new safety measures aimed at protecting teenagers on its platform, making accounts private by default for users under 18 and implementing tighter restrictions on sensitive content.
While these measures address growing concerns over teens’ mental health and online safety, many critics argue that they fall short of addressing the deeper issues teens face, such as body image anxiety and harmful online influences.
Meta’s Teen Safety Features for Instagram
Instagram has become a central part of many teens’ lives, but this pervasive influence has led to increased scrutiny regarding its effects on mental health, particularly body image issues.
According to Meta’s latest update, teens will now automatically have private accounts, and sensitive content—including posts related to violence or cosmetic surgery—will be limited. These measures aim to protect young users from potentially harmful content that could contribute to mental health struggles, like anxiety and depression.
The new Instagram teen account rules also include enhanced parental controls. Parents will be able to:
- Set daily time limits on the app
- Block usage during nighttime hours
- Monitor who their children are messaging
- Review the categories of content their children are viewing
Despite these changes, some argue that they don’t go far enough in addressing the root causes of teens’ mental health struggles, particularly when it comes to how body image issues are perpetuated through social media.
What Do Experts Say About Meta’s Teen Safety Features?
Experts and parents alike have welcomed these changes, but many feel that more needs to be done to protect teens. Laura Morton, a filmmaker who directed the documentary Anxious Nation, highlighted the lasting effects that social media exposure can have on young minds. Her daughter Sevey Morton, who has experienced body image issues as a result of Instagram’s airbrushed standards, said that while the changes are a good start, they aren’t a complete solution.
“Being exposed to that at a very young age impacted the way I grew into myself,” Sevey shared. “There is a huge part of me that wishes social media did not exist.”
While Sevey’s experience is far from unique, it underscores a broader concern among parents and mental health professionals: online child safety measures must go beyond just setting limits on time and content. Sensitive content restrictions might shield teens from some harmful material, but they don’t address the constant barrage of unrealistic beauty standards promoted by influencers.
Furthermore, while Meta’s teen safety features give parents more control, experts caution that parental monitoring cannot replace the need for more stringent platform-wide changes. For instance, some have suggested more aggressive action against accounts promoting potentially harmful content, such as influencers selling weight loss products or cosmetic enhancements.
According to Dr. Amanda Lenhart, a researcher specializing in teens and social media, “While Meta’s new rules are a step in the right direction, they do little to combat the broader culture of comparison and validation-seeking behavior that social media encourages.”
What Does the Future Hold for Teen Online Safety?
Looking forward, there is hope that Meta’s efforts could prompt other social media platforms to take teen safety more seriously. However, critics argue that tech companies like Meta need to implement more comprehensive strategies, focusing on how social media itself fosters an environment that may harm teen mental health.
As teens continue to grapple with body image issues and anxiety driven by online interactions, many call for stronger regulations. This includes more transparent moderation practices, enhanced parental controls on Instagram, and the development of tools that actively discourage toxic behavior online.
Meta has not ruled out further changes. In a statement, the company acknowledged that while recent updates are significant, they are not the end of the road: “We are committed to improving our platforms and working with parents, teens, and mental health experts to ensure the safety of young people online.”
Is It Enough?
While Meta’s new features, such as private accounts for teens and sensitive content restrictions, are a positive step, many feel they are only a partial solution to a complex problem. As body image issues, cyberbullying, and mental health concerns persist among teens, the question remains: Are these changes enough, or should we expect more from social media giants like Meta?
What do you think about Meta’s new teen safety features? Let us know your thoughts in the comments.