Instagram to alert parents if teens repeatedly search for suicide terms

Instagram on Thursday announced a new feature strengthening teen protection, saying it will start alerting parents if their children repeatedly search for terms related to suicide or self-harm in a short period of time.

These alerts will, however, only go to parents who have enrolled in the app’s parental supervision program, the Associated Press reported.

This comes even as the social media platform’s parent company, Meta, is in the midst of two trials over mental health harms to children. Instagram already blocks such content from showing up on teenagers’ accounts as search results, and directs people to use helplines instead.

The parental alerts will be sent through email, text or WhatsApp, based on the contact information parents have provided. A notification will also be sent to the parents’ Instagram account, AP reported.

The alerts will be rolled out in the coming weeks in the United States, Britain, Australia and Canada, and will be expanded to other regions later this year, according to AFP.

Notification won’t be sent ‘unnecessarily’: Meta

In a blog post about the new protection, Meta said it wants to “avoid sending these notifications unnecessarily”, acknowledging that overusing them would make them less useful.

“Our goal is to empower parents to step in if their teen’s searches suggest they may need support,” Meta said, adding that it was also working on similar notifications to parents about their kids’ interactions with artificial intelligence. “These will notify parents if a teen attempts to engage in certain types of conversations related to suicide or self-harm with our AI,” the tech company said.

Meta further said this is “important work”, adding that it will have more to share in the coming months.

Meta, other social media firms sued by thousands of families over addiction, harms

Meta and other social media companies are being sued by thousands of families, along with school districts and government entities, over claims that they deliberately design their apps to be addictive, AP reported.

The plaintiffs have also alleged that these platforms do not have enough safeguards to protect kids from content which can lead to depression, eating disorders and suicide.

Meta is currently facing two trials, with one in Los Angeles questioning whether Meta’s platforms deliberately induce addiction and harm minors. A second trial underway in New Mexico will decide whether Meta has failed to protect kids from sexual exploitation on its platforms, according to AP.

Discussing suicides can be triggering for some. However, suicides are preventable. A few major suicide prevention helpline numbers in India are 011-23389090 from Sumaitri (Delhi-based) and 044-24640050 from Sneha Foundation (Chennai-based).
