New Instagram feature warns parents if teens search suicide-linked terms often
Instagram will begin notifying parents if their children repeatedly search for terms linked to suicide or self-harm, the social media platform said Thursday. The alerts will only reach parents enrolled in Instagram’s parental supervision program.
The company said it already blocks such content from appearing in teen accounts’ search results and directs users to helplines. Alerts will be sent via email, text, WhatsApp, or through the parent’s Instagram account, depending on the contact information available. “Our goal is to empower parents to step in if their teen’s searches suggest they may need support,” Meta said in a blog post, adding that notifications will be carefully managed to avoid overuse, which could reduce their effectiveness.
The announcement comes as Meta faces two ongoing trials over alleged harms to children. In Los Angeles, a trial examines whether Meta’s platforms intentionally addict and harm minors, while a New Mexico trial considers whether the company failed to protect children from sexual exploitation. Thousands of families, along with school districts and government entities, have sued Meta and other social media firms, claiming their platforms are designed to be addictive and expose children to content that may contribute to depression, eating disorders, and suicide.
Meta executives, including CEO Mark Zuckerberg, have denied that their platforms cause addiction. During questioning in Los Angeles, Zuckerberg said the scientific evidence does not prove social media harms mental health.
Meta also said it is developing similar notifications to alert parents if their teens engage in certain conversations with Instagram’s artificial intelligence tools related to suicide or self-harm. “This is important work, and we’ll have more to share in the coming months,” the company added.