Britain could ban social media companies that fail to remove harmful material, health secretary Matt Hancock has said.
The politician wrote to Twitter, Snapchat, Pinterest, Apple, Google and Facebook after the death of Molly Russell, a 14-year-old girl who had been viewing material online linked to depression, self-harm and suicide.
“We can legislate if we need to,” he said, when asked about websites where content promoting self-harm and suicide can be found.
“It would be far better to do it in concert with the social media companies but if we think they need to do things that they are refusing to do then we can and we must legislate.”
The health secretary was asked, during his appearance on BBC’s Andrew Marr Show, if the UK would go as far as banning or imposing extra taxes on websites that failed to remove harmful content.
“Ultimately parliament does have that sanction, yes,” he said.
“It’s not where I’d like to end up, in terms of banning them, of course, because there’s a great positive to social media too.
“But we run our country through parliament and we will and we must act if we have to.”
Mr Hancock’s intervention came after Molly Russell’s father said that Instagram “helped kill” his daughter.
The teenager was found dead in November 2017 and an inquest into her death is expected later this year.
In his letter to the social media firms, the health secretary said that he felt “desperately concerned to ensure young people are protected”.
“I welcome that you have already taken important steps, and developed some capabilities to remove harmful content. But I know you will agree that more action is urgently needed.
“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.
“It is time for internet and social media providers to step up and purge this content once and for all.”
The health secretary added that the government is preparing a white paper examining “online harms”, which will consider content on suicide and self-harm.
“I want to work with internet and social media providers to ensure the action is as effective as possible,” Mr Hancock said.
“However, let me be clear that we will introduce new legislation where needed.”
The Samaritans praised the health secretary for “taking a positive step in contacting the social media companies and talking about ways social media platforms could be doing more to protect their users from harmful content”.
“While there are lots of positive peer support communities on social channels, we need to maximise opportunities to see positive content and minimise potential for seeing more harmful content,” a spokesperson for the charity said.
“Tech companies could do more to show users how to flag dangerous content and should work together to remove dangerous imagery across their platforms. There is also a need for more research into this issue, which should be carried out by these companies.”
However, the spokesperson added the charity knows “that for lots of people, social media and the online environment more generally provides them with an important space to share their feelings and receive support.
“It is really important that this continues, and we’d like companies to do more to ensure that people searching for self-harm and suicide content are able to access supportive content more easily.”
It is understood that Instagram is taking steps to reduce the amount of harmful content on the platform and will inform Mr Hancock of its plans.
The site’s approach will include blocking content from appearing on hashtag searches, where the hashtag is being used to share significant amounts of harmful material.
“Our thoughts go out to Molly’s family and anyone dealing with the issues raised in this report,” a spokesperson for Instagram said.
“Nothing is more important to us than the safety of the people in our community, and we work with experts every day to best understand the ways to keep them safe.
“We do not allow content that promotes or encourages eating disorders, self-harm or suicide and use technology to find and remove it.
“Mental health and self-harm are complex and nuanced issues, and we work with expert groups who advise us on our approach.
“They tell us that the sharing of a person’s mental health struggle, or connecting with others who have battled similar issues, can be an important part of recovery.
“This is why we don’t remove certain content and instead offer people looking at, or posting it, support messaging that directs them to groups that can help.
“We are undertaking a full review of our enforcement policies and technologies around self-harm, suicide and eating disorders. As part of this, we are consulting further with mental health bodies and academics to understand what more we can do to protect and support our community, especially young people.
“While we undertake this review, we are taking measures aimed at preventing people from finding self-harm related content through search and hashtags.”
Additional reporting by agencies