The father of Molly Russell, the teenager who took her own life after viewing disturbing material online, has urged social networks to take action immediately and not drag their feet until a regulator is introduced.
Ian Russell, who now runs the Molly Rose Foundation in memory of his daughter, spoke alongside the NSPCC in calling for swift changes within tech giants, six weeks after the Government unveiled its online harms White Paper, designed to make companies more accountable for the content on their platforms.
This Tuesday will mark 18 months since Molly’s death, and her family believe harmful content on social media was a contributory factor after finding material relating to depression and suicide on her accounts.
“The clock does keep ticking,” Mr Russell warned.
“Harmful content on Instagram still comes and goes, sometimes hidden behind privacy screens that blur the blood and the scars, and sometimes not.
“It’s still all too easy to find such dangerous content and, if you look beyond Instagram to other Facebook platforms, and further to a host of other platforms large and small, online harms are still available – and of course it’s far more than the self-harm content that affected Molly.
“It’s a much bigger problem than the problem Instagram promised to remove.”
Mr Russell said social networks should not use the time taken by the Government to decide on how to regulate as “a mechanism for delaying change”, adding that “profit must play a secondary role to positive action” and “success must be seen in terms of suicide reduction and support for those that desperately need it”.
“The tech companies have failed to regulate; they now call, in fact, for the Government’s help in sorting out their problems by providing regulation, but this call must not be a mechanism for delaying change,” he continued.
“From Silicon Valley to Select Committees, let us debate the form of new legislation, legislation that will take the place of, in effect, self-regulation, but, while the years pass that will allow the needed debate that will allow this legislation to be balanced, proportional and effective, we must make changes now.”
The NSPCC, which has long campaigned for greater online safety for children through its Wild West Web initiative, supported Mr Russell’s calls for urgency, telling social networks to “hard wire” safeguards for children into how they design and deliver their services.
The charity wants the Government to monitor how social networks respond to an interim code of practice and name and shame those who fail to comply.
“It will likely be a couple of years before we see legislation and then a regulator making its first decisions – two years is a long time when we know that children are being put in harm’s way today because platforms are continuing to not do everything they can to keep children safe,” said Andy Burrows, NSPCC associate head of child safety online.
“We know that in the White Paper the Government has signalled that they will be drawing up a couple of codes of practice on an interim basis – one around child sexual abuse and exploitation, and another around terrorist content.
“Until we have the legislation passed there aren’t the regulatory powers, but what the Government can and should do is really closely monitor and enforce whether platforms are playing ball with this interim code of practice.
“The Government should be ready to name and shame those platforms that are dragging their heels, that are refusing to take action.
“It’s very clear that regulation is coming, but we shouldn’t wait until the day that it becomes a legal requirement for platforms to start to comply.”
Health Secretary Matt Hancock recently revealed that social media giants have pledged hundreds of thousands of pounds to the Samaritans charity in a bid to rid the internet of self-harm videos and other damaging material.