Facebook's new vice-president, Sir Nick Clegg, has told the BBC the firm will do “whatever it takes” to make its social media platforms safer for young people. Sir Nick was responding to the case of 14-year-old Molly Russell, who took her own life after viewing distressing self-harm images on Instagram.

He added that some experts say it is wise to keep certain images up because they can also help people find support. He admitted he would not let his own children view some graphic examples.

Molly Russell took her own life in 2017. When her family looked into her Instagram account they found distressing material about depression and suicide. Her father says he believes Instagram is partly responsible for his daughter’s death.

“I can tell you firstly we’re going to look at this from top to bottom, change everything we’re doing if necessary, to get it right,” Sir Nick said. “We’re already taking steps soon to blur images, block a number of hashtags that have come to light, and thirdly to continue to work… with the Samaritans and other organisations.”

However he added that the advice of these experts is not to ban all content of this nature. “… I know this sounds counter-intuitive, but they do say that in some instances it’s better to keep some of the distressing images up if that helps people make a cry for help and then get the support they need,” he said.

The former UK Deputy Prime Minister also talked about tax, saying Facebook should pay more outside the US, at his first public appearance as the tech giant's head of communications. It was “unbalanced” that most of Facebook’s $4bn [£3bn] tax bill was paid in the US “even though the vast majority of Facebook’s users are outside the United States”, he said.

“That is what needs to change,” Sir Nick said, adding the onus was on governments to come up with “a better way to tax companies like Facebook”.

Following his interview with BBC media editor Amol Rajan, Sir Nick gave his first public speech since his surprise appointment, announcing the creation of an external body to help Facebook users challenge decisions made about flagged content.

He said he supported tech sector regulation and agreed with Facebook founder Mark Zuckerberg’s view that “Facebook should not make so many important decisions about free expression and safety on its own”.

In the wide-ranging speech, Sir Nick also acknowledged the platform had a responsibility to limit the potential for political damage caused by the spread of fake news. But he defended Facebook’s business model – of using personal data to sell targeted advertising, rather than charging users a subscription, which some would be unable to afford.

The collection and sharing of personal data, such as users’ location, shopping habits and holiday plans, was now “routine” among many private companies and public sector organisations, Sir Nick said. “The data-driven economy is here to stay and we have to find ways of managing its harms while preserving its benefits,” he said.

“It is, for better or worse, how the internet works,” he said.

Underlying many of the criticisms of Facebook’s business model, he argued, is the assumption that advertising is inherently exploitative. “I don’t share that view. There is such a thing as responsible advertising.”

Sir Nick added he had seen “many changes” at Facebook since he joined in October 2018. His appointment came as a surprise, especially as he had previously written critically about the social network. “As the internet has evolved, so have the norms and standards that apply to it,” he said in his speech.