
Senior coroner Andrew Walker ruled that 14-year-old Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content, the BBC reports.
“It would not be safe to leave suicide as a conclusion [on the cause of death],” Walker said.
In the UK, a coroner’s conclusion is tantamount to a court judgment.
The coroner said that Instagram and Pinterest used algorithms that produced “binge periods” of material, some of which the platforms selected and served to the schoolgirl without her requesting it.
“Some content romanticized self-harm by young people, while other content contributed to isolation and discouraged discussion of the issue with those who could help,” Walker said.
According to The Guardian, in 2017, in the period before her death, Russell saved, liked, or shared more than 2,000 Instagram posts related to suicide, depression, or self-harm. She also watched 138 videos of a similar nature, including episodes of the series “13 Reasons Why” rated “15+” and “18+”.
A consultant child psychiatrist testified at a hearing that he had been unable to sleep properly for several weeks after reviewing the Instagram content Russell had seen shortly before her death.
In the girl’s Pinterest account, investigators found hundreds of images related to self-harm and suicide. It also emerged that the platform had sent the schoolgirl content-recommendation emails with subject lines such as “10 pins about depression you might like.”
“It is likely that the material viewed by Molly, who was already suffering from a depressive illness and was vulnerable due to her age, affected her negatively and contributed to her death in a more than minimal way,” Walker said.
Representatives of Meta and Pinterest apologized and acknowledged that Russell had been exposed to content on their platforms that should not have been there.
“We are committed to ensuring that Instagram provides a positive experience for everyone, especially teens. We will take a close look at the full coroner’s report when he delivers it,” a Meta spokesperson said.
Pinterest said it is continually improving the platform and strives to make it safe for everyone.
“We will review the coroner’s report carefully,” the company said.
Recall that in May, TikTok was sued over the allegedly “deadly recommendations” of its algorithms.
In December 2021, Amazon’s Alexa virtual assistant suggested that a 10-year-old child attempt a potentially deadly challenge.