How Molly Russell's family held Big Tech accountable

'Demented trail of life-sucking content': Ian Russell speaks after landmark ruling on his daughter's death

Ian Russell broke down in tears after the inquest that found social media content had contributed to his 14-year-old daughter's death.

Its landmark ruling marked the culmination of five years of the Russell family campaigning and fighting for social media firms to face their responsibilities after Molly Russell died in 2017.

On Friday, the long-awaited inquest into her death concluded the schoolgirl from Harrow, north-west London, had died "from an act of self-harm while suffering depression and the negative effects of online content".

It was the first time technology platforms had been formally held responsible for the death of a child, Mr Russell told a press conference after the coroner's conclusion was delivered.

Molly, pictured as a young child, died after viewing content related to depression, suicide and self-harm on social media. Credit: Family handout

But the content shown in court represented "just a fraction of the Instagram posts" seen by his depressed daughter in the year of her death, Mr Russell said.

The family's lawyer said that it took until last month for Meta, which owns Instagram, Facebook and WhatsApp, to provide the inquest with more than 1,200 Instagram posts Molly engaged with.

These included some of the most distressing videos and posts viewed by the schoolgirl.

Mr Russell said he hoped the conclusion would be an “important step in bringing about much-needed change” and asked Meta chief Mark Zuckerberg to “just listen… and then do something about it”.

He was unequivocal in his criticism of the social media giant's claims that the content was "safe".

"We have heard a senior Meta executive describe this deadly stream of content the platform’s algorithms pushed to Molly as ‘safe’ and not contravening the platform’s policies," he told reporters.

"If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive and instead of being a bereaved family of four, there would be five of us looking forward to a life full of purpose and promise that lay ahead for our adorable Molly."

"They didn’t really consider anything to do with safety," he said.

'That’s the monster that has been created'

"Sadly their products are misused by people and their products aren’t safe," Mr Russell said. "That’s the monster that has been created but it’s a monster we must do something about to make it safe for our children in the future.”

"It’s the corporate culture that needs to change, so that they put safety first instead of profits."

He called for change to what he described as a "toxic corporate culture" at the heart of Meta.

"It’s time for the government’s Online Safety Bill to urgently deliver its long-promised legislation," he said.

"It’s time to protect our innocent young people, instead of allowing platforms to prioritise their profits by monetising their misery."

Following the conclusion, Meta said in a statement that the company is “committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers”.

A spokeswoman said the firm would “carefully consider the coroner’s full report when he provides it”.

Molly Russell took her own life after viewing harmful content online. Credit: Family handout/PA

Mr Russell welcomed Pinterest's admission that the site was not safe when his daughter viewed "depression pins" and said the firm had "engaged sincerely and respectfully" with the inquest.

Merry Varney, representing Molly’s family from law firm Leigh Day, said the family have "welcomed the transparency shown by Pinterest, as well as their acceptance and apology for the deeply harmful material Molly was able to access".

The photo-sharing platform has "taken steps" to learn lessons from Molly's death in a way Mr Russell said he hopes other social media firms will in any future inquiries.

"Seeking to find out how your loved one died should never be a battle," said Ms Varney.

She said that the inquest concluded children’s lives "remain at risk", which the family hopes "is acted upon urgently, so that tech giants can no longer invite [children] onto wholly unsafe and harmful platforms."

Ms Varney added that it remained to be seen how firms ultimately responded to the coroner’s findings of fact.

She said she hoped “Meta in particular” followed through after offering to meet with the Molly Rose Foundation, a charity set up by the family with the aim of preventing suicide, and that “they listen very carefully and have some humility”.

Mr Russell added that the measures currently taken to ensure safeguarding online have been “tiny”.

“I think that those steps are tiny and I think that globally dominant platforms can move a lot faster,” he said.

“I think we should all be looking at them and judging them on their actions.”

Molly Russell died in November 2017, aged 14. Credit: Family handout/PA

'We told this story in the hope that change would come about'

"Thank you Molly, for being my daughter. Thank you." an emotional Mr Russell said as he concluded the conference.

He previously thanked the coroner, the press, and Molly’s friends and family for enabling her story to be told.

“We shouldn’t be sitting here. This is a story about one person, but that person has affected one family and their friends and maybe the wider world in some way.

“We should not be sitting here. This should not happen because it does not need to happen.

“We told this story in the hope that change would come about.”

The ruling has been described as a global first and was called social media's "big tobacco moment" by the NSPCC.

Andy Burrows, head of child safety online policy at the children's charity, said: "For the first time globally, it has been ruled content a child was allowed and even encouraged to see by tech companies contributed to their death.

"The world will be watching their response."

On the evening of the inquest findings, a peer announced that an Online Safety Bill amendment would be brought forward to help bereaved parents access information from social media companies.

Baroness Beeban Kidron said she will table a change to the proposed legislation in the House of Lords after a coroner concluded content viewed on the internet contributed to the schoolgirl’s death.

  • Samaritans provides round the clock support for people when they need it most. You can call them 24 hours a day on 116 123. They also have tips if you're concerned about someone you know, and advice if you're struggling yourself

  • Young people who need support or have any concerns about what they have seen or heard during the inquest can contact Childline on 0800 1111 or via

  • Adults who are concerned about a child or who need advice about supporting a young person can contact the NSPCC Helpline on 0808 800 5000 or via
