Christchurch Call pushes back against algorithms
New Zealand is leading a bid to crack open secret social media algorithms as part of the Christchurch Call response, with backing from some big tech companies. But without the biggest players on board, will it blunt extremism on social media platforms? Mediawatch asks a social media pioneer who founded the forerunner of Twitter, but now wants to break the power of the big platforms.
The Christchurch Call was launched in Paris two years ago with a hiss and a roar, but the latest developments over the past week haven't made as many headlines.
This time, Prime Minister Jacinda Ardern and French President Macron have launched an initiative backed by Microsoft and Twitter to research social media algorithms that amplify misinformation.
Ardern also told the United Nations General Assembly that online platforms had become “a weapon of war”.
"What if this lie, repeated over and over and on many platforms, prompts, inspires or incites others to take up arms? To threaten the safety of others. To turn a blind eye to atrocities, or worse, to become complicit in them. What then?" she asked.
"It is no longer hypothetical. The weapons of war have changed. They are upon us and require the same level of action and activity that we put into the weapons of old."
“I think it’s very difficult for governments to say they’re going to step in and regulate something that’s so misunderstood,” Ardern told reporters later.
But Meta – the owner of Facebook, Instagram and WhatsApp – and TikTok's Chinese parent company, ByteDance, are not involved in this Christchurch Call project.
And the fast-growing platforms most likely to host misinformation that plays a role in radicalizing people — like Rumble or messaging app Telegram — were never part of the plan in the first place.
Can the Christchurch Call's attempt to crack open algorithms actually make social media safer without compromising fundamental freedoms?
Evan Henshaw-Plath is a social media innovator who helped create the Odeo platform in the mid-2000s, which later morphed into Twitter in 2006.
Twitter went on to dominate micro-blogging, and his former colleague Jack Dorsey became Silicon Valley aristocracy.
"It's fine for Nazi propaganda to be on the internet. If you want to know what the Nazis thought in the 1930s and '40s, it's good to be able to read it. [But] if you want to see a whole bunch of misinformation that is recruiting people, and taking them from one video to another video to the next video to the next video to, all of a sudden, things that are white supremacist... that's what the algorithm does," Evan Henshaw-Plath told Mediawatch.
"It's not about controlling what information is available. It's about understanding how engagement works and how people experience it. It's complicated and it's easy to get the regulations wrong. But regulation is the only thing that's going to change the behavior of these companies," said Henshaw-Plath, who moved from the US to Wellington this year to work on social media platforms that can't scale to the size of Twitter, Facebook or TikTok – or do the kind of harm those huge platforms are now accused of.
These algorithms are the private property of some of the biggest companies in the world. Will the Christchurch Call's research help?
"They are considered trade secrets, but they don't have to be. There's a bunch of movements – including the Algorithmic Justice League – where people are saying we need to be able to see what those algorithms are and how they're designed. And we need to be able to compare them," Henshaw-Plath told Mediawatch.
Users of his own service, Planetary, can choose the algorithms they want to use – and modify them.
"If you're a programmer you can actually see how we implement the algorithms – and contribute to them, so people get algorithms that do what they want and not what the platforms want."
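Planetary's real implementation is its own open code; purely as an illustration of the idea of user-selectable, user-modifiable feed algorithms, a sketch might look like this (all names and scoring rules here are hypothetical, not Planetary's actual API):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Post:
    author_followed: bool  # does the reader follow this author?
    likes: int
    age_hours: float

# Each "algorithm" is just a scoring function; a user picks one,
# or a programmer contributes a new one to the registry.
def chronological(p: Post) -> float:
    return -p.age_hours          # newest first

def engagement(p: Post) -> float:
    return p.likes / (1 + p.age_hours)   # likes, decayed by age

def friends_first(p: Post) -> float:
    return (1000.0 if p.author_followed else 0.0) - p.age_hours

ALGORITHMS: dict[str, Callable[[Post], float]] = {
    "chronological": chronological,
    "engagement": engagement,
    "friends_first": friends_first,
}

def build_feed(posts: list[Post], choice: str) -> list[Post]:
    """Rank the feed with whichever algorithm the user selected."""
    return sorted(posts, key=ALGORITHMS[choice], reverse=True)
```

The point of the sketch is the registry: because the ranking function is visible and swappable, the ordering of the feed is the user's choice rather than a trade secret.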
Henshaw-Plath said it is in the interests of major platforms to understand the impact of their algorithms.
“Facebook has invested a lot of funds, ironically enough, in privacy research. At Planetary, we use a bunch of research through Oxford (University) that was funded by Facebook.”
People laughed at Mark Zuckerberg's awkward launch of the Metaverse concept. But if this much more immersive and intense online experience becomes as popular as Facebook, will it further amplify the impact of bad things?
"We are always worried about emerging media. You can find essays on how dangerous the novel was because it kept people inside. And the same with the telephone and the radio and the television and the newspapers and the web and email and now... social media," Henshaw-Plath said.
"Hitler, Stalin and Mussolini could not have come to power without the radio. We couldn't have had that kind of unified, totalitarian authoritarian state without radio."
"But it's important to remember that, for example, trans people were all in the closet 10 years ago, but they were able to see themselves represented on social media, whereas the media before that never gave them a voice.
“And then once they saw themselves represented, they were able to come out of the closet.
"But what we have now is fundamentally different. With the Metaverse, augmented reality and virtual reality, an immersive world will transform these forums – but we don't know where it will lead.
"What we need to do is look at what spaces and what activities encourage this kind of authoritarian, strongman behavior – this kind of strong in-group and out-group behavior – especially (in) alienated young men who don't feel like they have a community.
“For me, it’s a very exciting time when we see something like Facebook finally die – but Instagram, WhatsApp, Oculus (also owned by Meta) survive.
"When these dominant platforms disappear, there is room for new things to be created. And we can decide what those things are, what values underpin them, and what kind of society we want them to help us move towards."
This week, Facebook’s parent company, Meta, hosted an online briefing for New Zealand journalists with Andy O’Connell, head of policy and strategy at Meta.
He was briefly in New Zealand after meeting with South Korea’s communications regulator in Seoul last week.
This was a "context only" briefing, to tell us how Meta already rates, moderates and monitors content – and the anti-extremism initiatives it supports around the world and here in New Zealand to that end, such as a code of conduct recently published by Netsafe.
Journalists were also told that Meta subjects itself to scrutiny, for example through external audits of its reports on the enforcement of its Community Standards (effectively the codes of conduct for users of its platforms).
Mediawatch has requested a recorded interview with O'Connell about Meta's algorithmic transparency – and the Christchurch Call. No luck so far.