A senior executive at Instagram’s parent company has defended the platform’s policies on suicide and self-harm content, telling the inquest into Molly Russell’s death that guidelines had always been drafted in consultation with experts.
Elizabeth Lagone, head of health and wellbeing policy at Meta, said the social media group worked “extensively with experts” when writing guidelines, which allow users to discuss feelings related to suicide or self-harm.
Molly, 14, from Harrow, north-west London, killed herself in November 2017 after viewing extensive amounts of material on platforms including Instagram related to suicide, depression, self-harm and anxiety.
Lagone told North London coroner’s court that at the time of Molly’s death, users were allowed to post content about suicide and self-harm to “facilitate the coming together to support” other users but not if it “encouraged or promoted” such actions.
In February 2019, Instagram changed its guidelines to ban “all graphic suicide and self-harm content”. It still allows users to “talk about their own feelings related to suicide or self-harm” provided such content is not graphic, promotional, or shows methods or materials.
In a witness statement submitted to the court, Lagone said: “Experts have consistently told us that in the right circumstances, content which touches on suicide and self-injury can be shared in a positive context and can play an important role in destigmatizing mental health difficulties.”
Lagone said suicide and self-harm material could have been posted by a user as a “cry for help”. She also told the court it was important to the company that it “considers the broad and unbelievable harm that can be done by silencing [a user’s] struggles”. Lagone, who is based in the US, had been ordered to attend in person by the senior coroner, Andrew Walker.
Oliver Sanders KC, representing the Russell family, asked Lagone if Instagram had treated young users like Russell as “guinea pigs” when it introduced a system known as content ranking in 2016. Under content ranking, users are sent posts that might be of interest to them, based on factors including what content they like and comment on. Instagram has a minimum age limit of 13.
“It’s right isn’t it that children, including children suffering from depression like Molly who were on Instagram in 2016, were just guinea pigs in an experiment?” said Sanders. Lagone replied: “That is specifically not the way we develop policies and procedures at the company.”
Addressing Instagram’s policy on banning glorification of self-harm but allowing users to create awareness of it, Sanders also asked: “Do you think an average 13-year-old would be able to tell the difference between encouraging and promoting self-injury and creating awareness of self-injury?”
Lagone replied: “I can’t answer that question because we do not allow content that encourages self-harm.”
The court also heard that Instagram had recommended Molly follow at least 34 accounts with handles relating to “sad or depressive” content. Four related to suicidal feelings and two to mortality, with other recommendations relating to self-injury, being close to death and burial.
Earlier on Friday morning, the inquest was shown videos on Instagram “of the most distressing nature” that were seen by the teenager before she took her own life.
The court was shown 17 video clips that Molly had saved or liked on Instagram before she died. Walker warned that the footage “appears to glamorize harm to young people” and is “of the most distressing nature and it is almost impossible to watch”.
The court was then shown a series of graphic video montages which showed people in suicidal situations. The montages were edited to music and some were captioned with references to suicide. Molly’s family decided to stay in the courtroom as the videos were played, but the coroner elected to take a 15-minute break in proceedings afterwards.
On Thursday, a senior executive at the image-sharing platform Pinterest admitted the platform was “not safe” when Molly Russell used it and apologized over the graphic material shown by the service to the teenager before her death.
The inquest continues.
In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is at 800-273-8255. You can also text HOME to 741741 to connect with a crisis text line counsellor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org