Coroner issues warning as Instagram videos of ‘most distressing nature’ shown to Molly Russell inquest

Videos shown to an inquest into the death of schoolgirl Molly Russell were so distressing a coroner considered editing them and issued the “greatest” warning before they were played.

The 14-year-old, from Harrow, northwest London, ended her life in November 2017 after viewing online content relating to self-harm, depression and suicide.

An inquest into her death at North London Coroner’s Court was shown 17 clips she liked or saved on Instagram which appeared to “glamorise harm to young people”.

Before the clips were played, Coroner Andrew Walker told those present to leave if they were likely to be affected by the material.

The court was told lawyers and the coroner had discussed whether to edit them beforehand because they were “so uncomfortable to view”.

“But Molly had no such choice, so we would in effect be editing the footage for adult viewing when it was available in an unedited form for a child,” Mr Walker said.

Describing footage the court was about to see, the coroner said: “It’s of the most distressing nature and it’s almost impossible to watch.

“If you are likely to be affected by any such videos, please do not stay to view them.”

Turning to Molly’s family, the coroner said: “There’s no need for any of you to stay.

“In my view, this sequence of video footage ought to be seen [by the court].”


The court was then played the clips, which were related to suicide, drugs, alcohol, depression and self-harm.

Molly’s family stayed in the courtroom as the videos were played, but the coroner elected to take a 15-minute break in proceedings afterwards.

The schoolgirl’s family has campaigned for better internet safety since her death almost five years ago.

Instagram’s guidelines at the time, which were shown to the court, said users were allowed to post content about suicide and self-harm to “facilitate the coming together to support” other users but not if it “encouraged or promoted” self-harm.

On Friday, the head of health and wellbeing at Instagram’s parent company Meta defended the social media platform’s content policies – saying suicide and self-harm material could have been posted by a user as a “cry for help”.


Elizabeth Lagone told the court it was an important consideration of the company, even in its policies at the time of Molly’s death, to “consider the broad and unbelievable harm that can be done by silencing (a poster’s) struggles”.

Ms Lagone also denied Instagram had treated children such as Molly as “guinea pigs” when it launched content ranking – a new algorithm-driven system for personalizing and sorting content – in 2016.

Molly’s family’s lawyer, Oliver Sanders KC, said: “It’s right, isn’t it, that children, including children suffering from depression like Molly, who were on Instagram in 2016 were just guinea pigs in an experiment?”

She replied: “That is specifically not the way we develop policies and procedures at the company.”

Asked by Mr Sanders whether it was obvious it was not safe for children to see “graphic suicide imagery”, the executive said: “I don’t know … these are complicated issues.”

Mr Sanders drew the witness’s attention to experts who had informed Meta it was not safe for children to view the material, before asking: “Had they previously told you something different?”

Molly Russell’s father Ian Russell (centre), mother Janet Russell (right) and her sister (left) arriving at Barnet Coroner’s Court on the first day of the inquest into her death

(Kirsty O’Connor/PA)

Ms Lagone responded: “We have ongoing discussions with them but there are any number of … issues we talk about with them.”

The court heard Molly set up an Instagram account in March 2015, when she was 12, and was recommended 34, “possibly more”, accounts related to sadness or depression on the platform.

Of the accounts recommended, Mr Sanders said one referred to self-injury, one to concealment, four to suicidal feelings, one to themes of being “unable to carry on”, two to mortality and one to burial.

On Thursday, Pinterest’s head of community operations, Judson Hoffman, apologized after admitting the platform was “not safe” when the 14-year-old used it.

Mr Hoffman said he “deeply regrets” posts viewed by Molly on Pinterest before her death, saying it was material he would “not show to my children”.

The inquest, due to last up to two weeks, continues.

If you are experiencing feelings of distress and isolation, or are struggling to cope, The Samaritans offers support; you can speak to someone for free over the phone, in confidence, on 116 123 (UK and ROI), email [email protected] or visit the Samaritans website to find details of your nearest branch.

For services local to you, the national mental health database Hub of Hope allows you to enter your postcode to search for organizations and charities that offer mental health advice and support in your area.

Additional reporting by Press Association
