Molly Russell was a 14-year-old schoolgirl who took her own life in 2017 after viewing harmful images on Instagram, a Facebook-owned social media platform.
She had entered a “dark rabbit hole of depressive suicidal content”, her father said.
Ian Russell holds Instagram partly responsible for his daughter’s death.
“I think Molly probably found herself becoming depressed,” he told BBC News last October.
“She was always very self-sufficient and liked to find her own answers. I think she looked towards the internet to give her support and help.
“She may well have received support and help, but what she also found was a dark, bleak world of content that accelerated her towards more such content.”
Mr Russell claimed the algorithms used by some online platforms “push similar content towards you” based on what you have previously been viewing.
He said: “I think Molly entered that dark rabbit hole of depressive suicidal content.
“Some were as simple as little cartoons – a black and white pencil drawing of a girl that said ‘Who would love a suicidal girl?’.
“Some were much more graphic and shocking.”
Instagram said that, between April and June 2019, it removed 834,000 pieces of suicide and self-harm content, 77% of which had not been reported by users.
But Mr Russell said: “It would be great if they could find a way to take down 10 times the number of posts and really reduce the potentially harmful content that is on their platform.”
Instagram chief executive Adam Mosseri said: “Nothing is more important to me than the safety of the people who use Instagram.
“We aim to strike the difficult balance between allowing people to share their mental health experiences – which can be important for recovery – while also protecting others from being exposed to potentially harmful content.”
Mr Russell urged parents to speak with their children about what they are viewing online and how they are accessing it.