Instagram Alerts Will Notify Parents When Teens May Need Support

Instagram is implementing a new feature aimed at enhancing teen safety by notifying parents when their children search for terms related to suicide or self-harm. This initiative will begin in the coming weeks and is part of the platform’s broader parental supervision tools.
Details of the Notification System
The alerts will be triggered if a teen repeatedly searches for concerning terms within a short time frame. Parents enrolled in Instagram’s supervision program will receive these notifications via email, text, or WhatsApp, as well as through in-app alerts. The messages will clearly explain the nature of the searches and provide expert resources to foster sensitive discussions.
Search Terms Leading to Alerts
- Direct terms such as “suicide” or “self-harm”
- Phrases suggesting suicidal thoughts or self-injury
- Expressions promoting or encouraging self-harm
These alerts are being rolled out in the United States, the United Kingdom, Australia, and Canada, with plans for expansion into other regions later in the year.
Balancing Awareness and Anxiety
Instagram aims to support parents while avoiding excessive notifications that may cause unnecessary alarm. The threshold for sending alerts is based on analyzing search patterns and consultations with experts from a dedicated advisory group on suicide and self-harm.
Experts such as Dr. Sameer Hinduja commend the initiative, highlighting its importance in empowering parents to intervene when needed. Vicki Shotbolt, CEO of Parent Zone, likewise emphasizes that keeping parents informed about their teens’ online activities can give them greater peace of mind.
Ongoing Commitment to Teen Safety
This new feature builds on existing protective measures that Instagram has implemented to shield teens from harmful content. The platform has strict policies against content that glorifies self-harm, which are actively enforced.
Instagram not only blocks searches for suicide and self-harm-related terms but also redirects users to appropriate resources and local organizations for support. Additionally, the platform will notify emergency services if there is an imminent risk of harm based on user activity.
Future Enhancements
Instagram acknowledges that teens are increasingly seeking support from AI. As part of its ongoing efforts, the platform is developing parental alerts for AI conversations in which teens discuss suicide or self-harm. Further details on these developments will be released in the coming months.
Through this initiative, Instagram demonstrates its commitment to empowering parents and fostering a safer online environment for teens who may be struggling with mental health issues.