- By Angus Crawford
- BBC News
Social media companies continue to serve “harmful content to millions of young people,” said Ian Russell, Molly Russell’s father.
He said he was horrified by the scale of the problem and that “not much has changed” since Molly took her own life aged 14. He fears more young lives will be lost.
A new study from the Molly Rose Foundation shows that young users can still access content about suicide and self-harm.
Social media platforms say they are working hard to keep teens safe.
The sites subject to research by the foundation created in Molly’s name – TikTok, Instagram and Pinterest – said they had created new tools to limit access to harmful material.
Molly, who took her own life after being exposed to a stream of dark and depressing content on Pinterest and Instagram, would have turned 21 this week.
An inquest last year concluded that she ended her life while suffering from depression and the negative effects of online content.
A foundation researcher assessed more than 1,000 individual posts and videos, identified by searching 15 hashtags associated with harmful material and with which Molly was known to interact.
Data experts at the Bright Initiative helped analyze posts and videos published between 2018 and October of this year.
On Instagram, the study found that almost half of the posts viewed contained content that “displayed despair, feelings of misery, and very depressive themes.”
On TikTok, half of the posts examined that contained “harmful content” had been viewed more than a million times.
And, on Pinterest, the researcher was actively recommended several images of “people standing on top of cliffs, drowning, stylized images of people free falling through the air.”
Online safety campaigner Mr Russell said “six years after Molly’s death this must now be seen as a fundamental systemic failure which will continue to cost young lives”.
Meta, which owns Instagram, said it has worked hard with experts and has “created more than 30 tools to help teens and families, including our Sensitive Content Control, which limits the type of content recommended to teens.”
A Pinterest spokesperson said it was “committed to creating a safe platform for everyone” and was constantly updating its policies and enforcement practices around self-harm content, “including blocking sensitive search terms and evolving our machine learning models so that this content is detected and removed as quickly as possible.”
A TikTok spokesperson said “content that encourages self-harm or suicide is prohibited” on the site, adding: “As the report highlights, we strictly enforce these rules by removing 98% of suicide content before it is reported to us.”
It said it provides “access to Samaritans directly from our app to anyone who may need help” and invests in “ways to diversify recommendations” and “block harmful search terms.”
The study acknowledged that the platforms had made some limited efforts to improve safety.
Following Molly’s death, Instagram announced a series of changes that, according to the report, “had a welcome targeted impact.”
TikTok, the report says, “seems to enforce its community standards more effectively than some other platforms.” And “some improvements had been made” by Pinterest.
But overall, the report identifies issues across all three platforms:
- A failure to adequately address harmful materials and how they are recommended
- A design that increases exposure to negative content through, for example, hashtag suggestions
- Algorithms that actively spread harmful content
- Community standards that are drawn too narrowly
Professor Louis Appleby, government adviser on suicide prevention and professor of psychiatry at the University of Manchester, said of the research: “We have evolved in the way we view the world online.
“We are in a new era of social responsibility and tech companies need to do more about their images and algorithms.”
The government believes that the Online Safety Act, which became law last month, should address these kinds of problems.
Regulator Ofcom is currently developing codes of practice which it expects tech companies to follow and which will be enforceable by law.