This is an archive article published on October 3, 2022

Why a UK court has blamed tech companies for a 14-year-old girl’s suicide

The UK court ruling has been called “social media’s big tobacco moment.” We explain the Molly Russell case.

During the two-week inquest into Molly Russell’s death, it was revealed that while she seemed to be doing well at school, in reality, she was suffering from depressive illness. (Photo: Twitter/@mollyroseorg)

A London coroner on Friday (September 30) ruled that harmful social media content contributed to a teenager’s death in 2017 “in a more than minimal way.”

This ruling is perhaps the first of its kind to directly blame social media platforms for a child’s death.

Molly Russell, a 14-year-old schoolgirl from London, died by suicide in 2017 after viewing online content about suicide and self-harm on platforms like Instagram and Pinterest.


“It would not be safe to leave suicide as a conclusion. She died from an act of self-harm while suffering from depression and the negative effects of online content,” senior coroner Andrew Walker said on Friday. The sites she frequented “were not safe as they allowed access to adult content that should not have been available for a 14-year-old child to see,” he added.

What did the investigation reveal?

During the two-week inquest into Molly Russell’s death, it was revealed that while she seemed to be doing well at school, in reality, she was suffering from depressive illness.

In the six months before she died, Molly had saved, liked or shared 16,300 pieces of content on Instagram, of which more than 2,100 (about 12 a day) were related to suicide, self-harm and depression, The New York Times reported. She had also created a digital pinboard on Pinterest with more than 400 images on similar subjects, The Guardian reported.

Walker told the court that Instagram and Pinterest used algorithms that exposed her to “binge periods” of harmful material, some of which she had never requested.


“These binge periods are likely to have had a negative effect on Molly,” Walker said. “Some of this content romanticised acts of self-harm by young people on themselves. Other content sought to isolate and discourage discussion with those who may have been able to help.”

What did online platforms say?

Senior executives from Pinterest and Meta — the parent company of Instagram — were ordered to attend the inquest in person. They had previously told a pre-inquest hearing that they would provide evidence remotely, citing short notice, Covid risks and busy work schedules, according to The Guardian. However, Walker stated that they would be required to look at video footage and documents, which would require them to be present in court.

Judson Hoffman of Pinterest apologised for some of the content the teenager had viewed and agreed that Pinterest was “not safe” when she had used it. He stated that the platform now uses artificial intelligence to remove harmful content.

Elizabeth Lagone, Meta’s head of health and wellbeing policy, told the inquest that the content about suicide and self-harm that Molly had accessed before her death was “safe.” However, she admitted that some of the posts Molly had viewed could have violated Instagram’s policies, The Guardian reported.


While Lagone said she was sorry that Molly had viewed distressing content, she nonetheless claimed that it was important for online platforms to allow people to express their feelings.

A ‘tobacco moment’ for social media

The inquest into Molly Russell’s death was significant because, for the first time, senior executives from Meta and Pinterest were summoned and gave evidence under oath in a UK court, according to the BBC.

Andy Burrows, the head of child safety policy at The National Society for the Prevention of Cruelty to Children (NSPCC), called the ruling “social media’s big tobacco moment.” He added, “for the first time globally, it has been ruled that content a child was allowed and encouraged to see by tech companies contributed to their death.”

This is not the first time social media platforms have been accused of promoting dangerous content to children, at times with deadly consequences. In July, TikTok faced a lawsuit by the parents of two young girls in the US, who died after taking part in the viral “blackout challenge” in 2021. The platform was accused of “intentionally” providing the children with dangerous videos.


Michele Donelan, the UK’s Secretary of State for Digital, Culture, Media and Sport, said that the inquest had “shown the horrific failure of social media platforms to put the welfare of children first.”

Calling the Online Safety Bill the answer to this problem, Donelan stated that “through it, we will use the full force of the law to make social media firms protect young people from horrendous pro-suicide material.”

Proposed changes to the Online Safety Bill

The Online Safety Bill seeks to improve internet safety while also defending freedom of expression. Introduced to the UK parliament in March 2022, the bill, which attempts to establish rules for how online platforms deal with harmful content, was an amended version of the draft first introduced in May 2021.

Among other things, the bill aims to prevent the spread of illegal content and protect children from harmful material. Platforms that are likely to be accessed by children will need to tackle content that is harmful to them, including but not limited to posts that promote self-harm or suicide.


Companies that do not comply with the rules will face fines of up to £18m or 10% of global annual turnover (whichever is higher), according to the bill.

Calling the court’s ruling historic, Baroness Beeban Kidron, a member of the House of Lords, said that she would table a change to the proposed Online Safety Bill after the inquest into Molly Russell’s death concluded, as reported by The Independent.

“And we do know, and I’m afraid my inbox in Parliament is full of people who have lost children sadly, and many of them struggle to get the information that they want, to get the access, to get that transparency…And I will be bringing forward an amendment to the Online Safety Bill in the House of Lords that seeks to make it easier for bereaved parents to access information from social media companies”, she said.
