Instagram to alert parents if teens search for self-harm and suicide content

Richard Morris and Liv McMahon, Technology reporters

Parents using Instagram's child supervision tools will soon receive alerts if their teen repeatedly searches for suicide or self-harm related terms on the platform.

It is the first time parent company Meta will proactively alert parents to searches by their child on Instagram for harmful material, rather than just block searches and direct users to external help.

Parents and teens enrolled in Instagram's Teen Accounts experience in the UK, US, Australia and Canada will be notified about the alerts from next week, with the rest of the world to follow later.

But suicide prevention charity the Molly Rose Foundation has strongly criticised the measures, warning they "could do more harm than good".

"This clumsy announcement is fraught with risk and we are concerned that forced disclosures could do more harm than good," said its chief executive Andy Burrows.

The organisation was established by the family of Molly Russell, who took her own life in 2017 at the age of 14 after viewing self-harm and suicide content on platforms including Instagram.

Burrows said "every parent would want to know if their child is struggling, but these flimsy notifications will leave parents panicked and ill-prepared to have the sensitive and difficult conversations that will follow".

Meta says alerts sent to parents when their child searches for suicide and self-harm material within a short space of time on Instagram will be accompanied by expert resources to help them navigate difficult conversations.

However, Molly Russell's father Ian, who set up the Molly Rose Foundation in her honour, remains sceptical about the alerts.

"Imagine being a parent of a teenager and getting a message at work saying 'your child is thinking of ending their life'... I don't know how I'd react," he told the BBC.

"And even if Meta say they're going to supply support to that parent, in that moment of panic when you hear that about your child, I don't think that's a very sensible way of doing things."

'Neglecting the real issue'

A number of charities including the Molly Rose Foundation have said Meta's announcement is almost an acknowledgment that more could be done to protect children on Instagram.

Ged Flynn, chief executive of charity Papyrus Prevention of Young Suicide, said while it welcomed Instagram's announcement, Meta was "neglecting the real issue that children and young people continue to be sucked into a dark and dangerous online world".

"Parents contact us every day to say how worried they are about their children online," he told the BBC.

"They don't want to be warned after their children search for harmful content, they don't want it to be spoon-fed to them by unthinking algorithms."

Meanwhile Leanda Barrington-Leach, executive director at children's charity 5Rights, said "if Meta is to take child safety seriously, it needs to return to the drawing board and make its systems age-appropriate by design and default".

Burrows also cited prior research by the Foundation which indicated Instagram still "actively" recommends harmful content about depression, suicide and self-harm to "vulnerable young people".

"The onus should be on addressing these risks rather than making yet another cynically timed announcement that passes the buck to parents," he added.

Meta disputed the organisation's findings published last September, saying it "misrepresents our efforts to empower parents and protect teens".

Increased scrutiny

Instagram's Teen Account alerts are designed to tell parents if there is a sudden change in their child's behaviour and search habits on the platform.

Meta said in a blog post the measures build on Instagram's existing teen protections, which include hiding material relating to suicide or self-harm on the app and blocking searches for harmful or dangerous content.

Alerts will be sent to parents by email, text, WhatsApp or on the Instagram app itself, depending on what contact information Meta has for families.

Image: two Meta screenshots showing how the alerts will look in the Instagram app, titled "alert about your teen's safety" and "how you can support your teen". Meta says these are the kinds of alerts parents will receive.

Meta says Instagram's new alerts - stemming from its analysis of user search patterns - may occasionally alert parents when there is no cause for concern and will "err on the side of caution".

Sameer Hinduja, co-director of the Cyberbullying Research Center, said the alert would "obviously" be alarming for any parent to receive.

But he told the BBC "what matters is not just the alert itself but the quality and usefulness of the resources parents immediately receive to guide them through what to do next".

"You can't drop a notification on a parent and leave them on their own, and it seems like Meta understands that," Hinduja added.

Instagram says in coming months it will also look to apply similar alerts if teens discuss self-harm and suicide with its AI chatbot as children "increasingly turn to AI for support".

Social media companies are facing increasing pressure from governments worldwide to make their platforms safer for children.

At the start of the year, Australia banned social media for under-16s - with Spain, France and the UK considering similar steps.

Regulators and lawmakers are meanwhile closely scrutinising big tech's business practices towards young users.

Meta boss Mark Zuckerberg and Instagram chief Adam Mosseri recently appeared in court in the US to defend the company against claims it targeted younger users.

Additional reporting by James Kelly

If you have been affected by the issues raised in this article, help and support is available via BBC Action Line.
