Speaking on the BBC’s Andrew Marr show, Matt Hancock, the UK Secretary of State for Health and Social Care since 2018, said: “If we think they need to do things they are refusing to do, then we can and we must legislate.” But he said it would be better to work jointly with social media companies.

The minister earlier called on social media giants to “purge” material promoting self-harm and suicide in the wake of links to a teenager’s suicide.

Asked if social media could be banned, Mr Hancock said: “Ultimately parliament does have that sanction, yes,” but added: “It’s not where I’d like to end up.”

Molly Russell, 14, took her own life in 2017 after viewing disturbing content about suicide on social media. Speaking to the BBC, her father said he believed Instagram “helped kill my daughter”.

Mr Russell also criticised the online scrapbook Pinterest, telling the Sunday Times: “Pinterest has a huge amount to answer for.” Instagram responded by saying it works with expert groups who advise it on the “complex and nuanced” issues of mental health and self-harm.

Based on that advice, which holds that sharing stories and connecting with others can aid recovery, Instagram said it “don’t remove certain content” but will “instead offer people looking at, or posting it, support messaging that directs them to groups that can help”.

But Instagram added that it is undertaking a full review of its enforcement policies and technologies. A Pinterest spokesman said: “We have a policy against harmful content and take numerous proactive measures to try to prevent it from coming and spreading on our platform.

“But we know we can do more, which is why we’ve been working to update our self-harm policy and enforcement guidelines over the last few months.” Facebook, which owns Instagram, said earlier it was “deeply sorry”.

The internet giant said graphic content which sensationalises self-harm and suicide “has no place on our platform”.
