Sitting in the witness box of a small London courtroom last week, a Meta executive faced an uncomfortable question: Did her company contribute to the suicide of a 14-year-old named Molly Russell?
Videos and images of suicide, self-harm and depressive content that the teenager viewed in the months before she died in November 2017 appeared on a screen in the courtroom. The executive was read a post that Molly had liked or saved from Instagram, and heard how it was copied almost verbatim in a note filled with words of self-loathing later found by her parents.
“This is Instagram literally giving Molly ideas,” Oliver Sanders, a lawyer representing the family, said angrily during one moment of the exchange.
Leaning forward in the witness chair, the executive, Elizabeth Lagone, who leads the company’s health and well-being policy, responded: “I can’t speak to what was going on in Molly’s mind.”
The coroner overseeing the case, who in Britain is a judge-like figure with wide authority to investigate and officially determine a person’s cause of death, was far less circumspect. On Friday, he ruled that Instagram and other social media platforms had contributed to her death – perhaps the first time anywhere that internet companies have been legally blamed for a suicide.
“Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content,” said coroner Andrew Walker. Rather than officially classify her death as a suicide, he said the internet “affected her mental health in a negative way and contributed to her death in a more than minimal way.”
The dispassionate and declarative judgment concluded a legal battle that pitted the Russell family against some of Silicon Valley’s largest companies. Delving into many parents’ worst fears about the influence of the internet and social media on their children, the case reverberated in Britain and beyond. A crowd of television cameras and photographers gathered outside the courtroom when the decision was announced.
‘This was David and Goliath’
Thousands of images, videos and other social media material from Molly’s accounts were revealed during the investigation, one of the largest public releases of its kind. That provided the sort of detail that researchers studying the mental health effects of social media have long said platforms like Meta, which owns Facebook and Instagram, withhold on privacy and ethical grounds.
Molly’s social media history included material so upsetting that one courtroom worker stepped out of the room to avoid viewing a series of Instagram videos depicting suicide. A child psychologist who was called as an expert witness said the material was so “disturbing” and “distressing” that it caused him to lose sleep for weeks.
The companies face no financial or other penalty because of the decision. The case was a coroner’s inquest to determine a cause of death, not a criminal or civil trial. The family said it had pursued the case as a form of justice for Molly and to raise awareness about youth suicide and the dangers of social media.
But already, a draft law partly inspired by Molly’s death, to force social media companies to adopt new child safety protections or risk heavy fines, is winding its way through Britain’s parliament. Instagram and Pinterest restricted access to some suicide and self-harm content. And lawyers representing American families that are suing TikTok and Meta for contributing to their children’s deaths are pointing to the outcome as a precedent.
“This was David and Goliath,” said Beeban Kidron, a member of the House of Lords and founder of 5Rights, a non-profit pushing for stricter online child-safety laws. “The Russell family have battled for five years to get the companies into an environment where under oath they had to account for their actions.”
Meta, which said during the inquest that it had never studied the effects of suicidal and depressive Instagram content on its youngest users, said in a statement afterward that its “thoughts are with the Russell family” and that it was “committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers”.
The Russell family had an “almost boring” life in a north London suburb, said Molly’s father, Ian Russell, in an interview in July. Worried about their three daughters’ use of technology, he and his wife, Janet, attended e-safety classes at the girls’ school and tried to keep tabs on their social media accounts. Phones were banned at the dinner table.
Molly, like her two older sisters, got a basic phone at age 11, when many British children begin commuting to school independently. She received an iPhone as a 13th birthday present, not long after she had created an Instagram account with her parents’ permission.
Molly, who had enjoyed horseback riding and pop music, began spending more time in her room, but nothing raised alarms. Russell said she had rarely posted anything publicly on social media, but it was not uncommon to find her sitting on her bed watching Netflix on an iPod Touch while messaging with her friends on another device.
“She was a teenager; it almost would have been worrying if she didn’t,” Russell said. “How you split those things from normal behaviour and maybe something of concern, I really don’t know that you can.”
In the days after Molly’s death, Russell said, the family struggled to understand what had gone wrong. She had been a bit downcast for parts of the past year, but had perked up of late. The family attributed the mood swings to normal adolescent behaviour.
The night before she died, the family watched a reality-TV show together, and Molly asked her father for help with a work-experience project. She was excited about tickets to see Hamilton and to play a lead part in an upcoming school play.
It was only when her father sat down with the family computer that pieces began coming together. After he gained access to her Instagram account, he found a folder titled “Unimportant things” with dozens of troubling images and quotes. “Who would love a suicidal girl?” one said.
He gasped while reviewing Molly’s email inbox, where he found a note from Pinterest that arrived about two weeks after her death. “Depression Pins you may like,” it said.
In January 2019, Russell went public with Molly’s story. Outraged that his young daughter could view such bleak content so easily and convinced that it had played a role in her death, he sat for a TV interview with the BBC that resulted in front-page stories across British newsstands.
Russell, a television director, urged the coroner reviewing Molly’s case to go beyond what is often a formulaic process and to explore the role of social media. Walker agreed after seeing a sample of Molly’s social media history.
That resulted in a years-long effort to get access to Molly’s social media data. The family did not know her iPhone passcode, but the London police were able to bypass it to extract 30,000 pages of material. After a lengthy battle, Meta agreed to provide more than 16,000 pages from her Instagram, such a volume that it delayed the start of the inquest. Merry Varney, a lawyer with the Leigh Day law firm who worked on the case through a legal aid program, said it had taken more than 1000 hours to review the content.
What they found was that Molly had lived something of a double life. While she was a regular teenager to family, friends and teachers, her existence online was much bleaker.
‘A ghetto of the online world’
In the six months before Molly died, she shared, liked or saved 16,300 pieces of content on Instagram. About 2100 of those posts, or about 12 per day, were related to suicide, self-harm and depression, according to data that Meta disclosed to her family. Many accounts she interacted with were dedicated to sharing only depressive and suicidal material, often using hashtags that linked to other explicit content.
Many posts glorified inner struggle, hiding emotional duress and telling others “I’m fine”. Molly went on binges of liking and saving graphic depictions of suicide and self-harm, once after 3am, according to a timeline of her Instagram usage.
“It’s a ghetto of the online world that once you fall into it the algorithm means you can’t escape it and keeps recommending more content,” Russell said during testimony.
Molly did not talk about her struggles with family, but she sought solace from online influencers who posted regularly about sadness and suicide. From an anonymous Twitter account her family discovered later, Molly had reached out to at least one influencer about her despair — messages that never received a response.
Jud Hoffman, the head of community operations at Pinterest, said he “deeply regrets” that Molly had viewed explicit material that he would not want his own children to view. “I am sorry,” he said.
‘Demented trail of life-sucking content’
Meta acknowledged that Molly had seen some content that violated its policies, but defended its practices overall as a balance between free expression and safety. The company added new protections in 2019, after the family went public about Molly’s experience, including prohibiting graphic images of self-harm such as cutting, and providing links to resources for those looking at sad or depressive material.
Lagone, who has a background in public health and was hired by Meta in 2020, said that while she was sorry Molly had seen such distressing content it was important to give people space to express sadness openly as “a cry for help.”
After the final decision in the case was announced on Friday, Russell was still stewing about a comment made by Lagone during her testimony that some of the material viewed by Molly had been safe.
“If this demented trail of life-sucking content was safe,” he said, “my daughter Molly would probably still be alive.”
This article originally appeared in The New York Times.
For immediate support, phone Lifeline on 13 11 14, Beyond Blue on 1300 22 4636 or the Kids Helpline on 1800 55 1800. Find a range of mental health tests, tools and support resources at beyondblue.org.au and blackdoginstitute.org.au.