Facebook has announced curbs on its live-streaming feature ahead of an online extremism summit in Paris, called in response to the Christchurch mosque attacks. The tech giant said a “one-strike policy” would ban those who violate new Facebook Live rules. New Zealand Prime Minister Jacinda Ardern called the measures a “good first step”. In March a gunman live-streamed his attacks on two mosques in Christchurch, in which 51 people died.
Ms Ardern is chairing the summit with French President Emmanuel Macron. It aims to co-ordinate international efforts to stop social media being used to organise and promote terrorism.
Political leaders from Europe, Canada and the Middle East will meet senior representatives from companies such as Facebook, Google and Twitter, who will issue a joint “call to action” to co-operate on “transparent, specific measures” to eliminate terrorist material.
“The dissemination of such content online has adverse impacts on the human rights of the victims, on our collective security and on people all over the world,” reads the pledge. UK Prime Minister Theresa May will call for governments and technology companies to work together to prevent terrorist material being shared online.
Mrs May said the fact Facebook had to remove 1.5 million copies of the video was “a stark reminder that we need to do more”. Speaking ahead of the summit, the prime minister said the tactic of live-streaming attacks “exposed gaps in our response and the need to keep pace with rapidly changing technological developments”.
She said: “My message to governments and internet companies in Paris will be that we must work together and harness our combined technical abilities to stop any sharing of hateful content of this kind.” In a statement, Facebook said that anyone sharing “violating content” like a statement from a terrorist group without context would be blocked from using Facebook Live for a set period, such as 30 days.
The company will also extend these new restrictions to other areas of the platform in the coming weeks, including advertising. Facebook has also pledged $7.5m (£5.8m) towards new research partnerships to automatically detect banned content, after some users bypassed existing detection systems by uploading edited versions of the Christchurch attack footage.
“Our goal is to minimize risk of abuse on Live while enabling people to use Live in a positive way every day,” the statement said. In the wake of the attack Facebook faced heavy criticism for its lack of response to officials. At the time, New Zealand’s privacy commissioner wrote emails to company executives saying their silence was “an insult to our grief”, the New Zealand Herald reported.
In her speech, Mrs May is expected to raise concerns about the threat of far-right political groups online and call for an international approach to regulation. She is due to say that technology companies responded effectively to her call to fight propaganda from the Islamic State group, after the 2017 attacks at Westminster Bridge, Manchester Arena and London Bridge.
Mrs May will say: “That shows us what is possible. Our work here must continue in order to keep pace with the threat. But we also need to confront the rise of the far right online.” The UK has recently published its own plans to introduce a legal duty of care for internet companies, which would be enforced by a new independent regulator.