World / Technology

After the Christchurch attacks, Twitter made a deal with Jacinda Ardern over violent content. Elon Musk changed everything

16:05 on 28 April 2024

By Emily Clark for the ABC

Jacinda Ardern founded the Christchurch Call after a terrorist livestreamed his shooting rampage at two Canterbury mosques. Elon Musk took over Twitter in 2022. Photo: RNZ / AFP

After the worst terrorist attack in New Zealand's history was live-streamed, then-prime minister Jacinda Ardern made it her mission to eliminate violent and extremist material from the internet.

Now the organisation she founded - the Christchurch Call - has confirmed to the ABC it is in contact with the Australian government over its legal battle with X, formerly known as Twitter.

Australia's eSafety commissioner Julie Inman Grant and social media platform X - owned by Elon Musk - are in a legal dispute over footage of an alleged terrorist attack in Sydney's west that was posted to the platform.

Last week, just a day after six people died in a stabbing attack in Westfield Bondi Junction, Bishop Mar Mari Emmanuel was stabbed during a service at Christ The Good Shepherd Church in Western Sydney.

The service had been live-streamed, so when authorities declared they were treating the attack as an alleged act of terrorism, versions of the video circulating online were designated class 1 material, or material depicting abhorrent violent conduct.

The law requires social media companies to take all reasonable steps to remove that material from their platforms.

Inman Grant issued a take-down order, and while X geoblocked the material in Australia, the platform says it is an overreach for the videos to be removed worldwide.

But the eSafety commissioner's position is that Australians can still access the material with a VPN, so the videos need to be removed.

Bishop Mar Mari Emmanuel was stabbed during a church service, which was being live-streamed. Photo: Screenshot / YouTube

It has been more than five years since the 15 March 2019 Christchurch mosque attacks in which a gunman killed 51 people during Friday prayers.

The Christchurch massacre was live-streamed and watched by more than 4,000 people before the feed was shut down by social media platforms.

At the time, Ardern wrote in the New York Times that the attack was "designed to be broadcast on the internet" and issued a call to action to social media platforms to be proactive in removing offending material.

The platform X is still listed as a member of the Christchurch Call, and when he took ownership, Musk reportedly confirmed to French President Emmanuel Macron he would uphold its commitments, which include the removal of terrorist and violent material.

Despite stepping down as prime minister, Ardern was last year appointed as special envoy to the Christchurch Call to continue its work.

In a statement to the ABC, the secretariat said it had been in contact with the Australian government after the "murders at Bondi Westfield and the [alleged] terrorist attack at the Assyrian Orthodox Christian Church in Wakeley".

"The Christchurch Call secretariat contacted Australian officials early on Tuesday morning to offer our support," it said.

"We remain in contact and are discussing with Australian officials and our civil society advisory network what advice and practical support the Call community can provide."

The church stabbing came just a week after six people were murdered by a knifeman at Westfield Bondi Junction. Photo: DAVID GRAY / AFP

The statement said the organisation was "deeply concerned about this incident and its impacts, both directly and through the dissemination of content".

"The Christchurch Call continues to work towards eliminating terrorist and violent content online. Significant progress has been made with improving online safety," the statement read.

"But as demonstrated by this and similar events, this is an ongoing global challenge. There remains work to do."

The aftermath of 15 March

The terrorist who carried out the Christchurch attacks wore a GoPro camera and live-streamed the event, and while that feed was cut by social media platforms, the videos continued to circulate.

Ardern wrote in the New York Times: "The entire event was live-streamed - for 16 minutes and 55 seconds - by the terrorist on social media. Original footage of the live stream was viewed some 4,000 times before being removed from Facebook.

"Within the first 24 hours, 1.5 million copies of the video had been taken down from the platform. There was one upload per second to YouTube in the first 24 hours."

The Christchurch attacks were seen globally as the moment when governments were no longer willing to let platforms regulate themselves on the issue of extremist material.

Jacinda Ardern with French President Emmanuel Macron. Photo: AFP / Pool / Charles Platiau

Two months later, on 15 May 2019, Ardern and Macron, along with 10 heads of state and leaders of global technology companies, met in Paris and launched the Christchurch Call to Action.

It was a coming together of governments, technology companies and groups within civil society to tackle the spread of violent and extreme material online.

Rob Nicolls, a senior research associate in media, technology and regulation at the University of New South Wales, said Christchurch "set off a thinking pattern in e-safety policy".

"The approach to abhorrent graphical violence and then the Online Safety Act in Australia - under which the eSafety commissioner issued the removal notice - those were effectively the government response to the Christchurch Call," he said.

When then-communications minister Paul Fletcher introduced the bill, he said one of the key recommendations ahead of drafting the legislation was "to deny terrorists the ability to spread their propaganda and to incite further violence and acts of hate".

On the United States Studies Centre's Technology and Security podcast, Twitter's then-head of public affairs, Kara Hinesley, recently recalled the formation of the Christchurch Call and what the Twitter of 2019 committed to.

"On the tech side, there were some really key principles that were pulled together. I remember having to fly to San Francisco. We had a meeting actually at Twitter's headquarters on Market Street, and we had all the different general counsels and chief legal officers and public policy folks from all the companies that signed on," Hinesley said.

"And we came together with prime minister Ardern's team and we started talking through what this could look like."

Facebook's parent company Meta is among the companies signed up to the Christchurch Call. Photo: AFP

Hinesley said the group asked how the technology companies could establish a "content incident protocol".

"So that whenever there is an incident happening, but it had an online aspect to it like the Christchurch attacks, that we would be able to quickly not only let all the other companies know, but also start to put all of these hashes into this database," she said.

"And be able to eradicate that content that had been verified as terrorist or violent extremist content, so illegal in the jurisdictions that we operate within and be able to take that down en masse."
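The hash-sharing arrangement Hinesley describes can be sketched in a few lines. This is an illustrative toy only, assuming a simple exact-match scheme with hypothetical function names; industry systems such as the GIFCT hash-sharing database use perceptual hashes (for example PDQ) so that re-encoded or lightly edited copies of a video still match.

```python
import hashlib

# Shared database of hashes of content that one platform has verified
# as terrorist or violent extremist material. In practice this is an
# industry-wide service; here it is a plain in-memory set.
shared_hash_db = set()

def fingerprint(media_bytes: bytes) -> str:
    """Exact-match fingerprint. A real deployment would use a perceptual
    hash so near-duplicate copies are also caught."""
    return hashlib.sha256(media_bytes).hexdigest()

def register_verified_content(media_bytes: bytes) -> None:
    """One company verifies the content, then shares its hash so every
    other signatory platform can act on it."""
    shared_hash_db.add(fingerprint(media_bytes))

def should_block_upload(media_bytes: bytes) -> bool:
    """Each platform checks new uploads against the shared database."""
    return fingerprint(media_bytes) in shared_hash_db
```

The design point is that platforms share fingerprints rather than the footage itself, letting each company remove matching uploads en masse without redistributing the material.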

Now, more than 50 countries have signed up to the Christchurch Call, including the US, Britain, Germany and South Korea.

Tech companies signed up to its commitments include Facebook's parent company Meta, Amazon, Google, Microsoft, YouTube, Zoom and Twitter, now known as X.

Those signatory companies pledge to: "Take transparent, specific measures seeking to prevent the upload of terrorist and violent extremist content and to prevent its dissemination on social media and similar content-sharing services, including its immediate and permanent removal, without prejudice to law enforcement and user appeals requirements, in a manner consistent with human rights and fundamental freedoms."

But the Twitter that originally met with Ardern and that signed up to both the Christchurch Call and Australia's voluntary code of practice is very different to the company we now know as X.

Elon Musk takes over Twitter

By the time Musk eventually took over Twitter in October 2022, staff knew to expect changes.

Elon Musk took over Twitter, now called X, in 2022. Photo: Jaap Arriens / NurPhoto via AFP

But as several told the ABC, it was the speed and breadth of changes that took many of them by surprise.

Of the 7500 people who worked at Twitter, Musk laid off about half of them, including experts in content moderation and the head of legal, policy and trust Vijaya Gadde.

By early November, Australian-based staff had been locked out of the Twitter systems.

The team who had made commitments under the Christchurch Call - including founder Jack Dorsey - was no longer in the building.

At a summit on national security and disinformation in November 2022, Ardern said the relationship with X was now in "unknown territory".

By December, Macron had spoken with Musk and tweeted that he had confirmed X would continue to participate in the initiative.

But since then the company has moved away from its cohort of social media platforms.

Nicolls said X "was trying to distinguish itself".

"Partly driven by Elon Musk's sort of free speech absolutism, and partly driven by the fact that X is not actually well set up for operations in countries outside of the US," he said.

How other platforms approach regulation

Nicolls said the case that returns to the Federal Court on 10 May is a "test case" for some of the broader ambitions of e-safety legislation across the world.

"It's a test case because many countries have take-down notice provisions like the ones here. The US doesn't, but many European countries, many Asian countries [do]," he said.

Information from Australia's eSafety commissioner shows that after the alleged terrorist attack at Wakeley, its office worked with Google, Microsoft, Snap and TikTok to remove the offending material.

Then on 16 April, eSafety "issued Class 1 removal notices to Meta and X Corp, formally seeking removal of this material from their platforms".

"In the case of Meta, eSafety was satisfied with its compliance because Meta quickly removed the material identified in the notice," the office's release said.

"In the case of X Corp, eSafety was not satisfied the actions it took constituted compliance with the removal notice and sought an interim injunction from the Federal Court."

Nicolls said Meta's cooperation with the eSafety commissioner ultimately had a bottom line.

"There's a real expectation that well, of course, if you're going to operate in a country, you need to build relationships with the regulator and the government in that country because that's part of doing business," he said.

"Mark Zuckerberg is not a free speech shrinking violet, but Meta has worked out that well actually we do business in Australia ... $1.6 billion worth of business, so why would you take a view that says 'I'll threaten that revenue stream'?"

- This story was first published by the ABC