On February 24, as Russian forces began their invasion, stories of US-funded bio-labs and bio-weapons research in Ukraine started to spread on social networks.
The false claims emerged in right-wing circles and were quickly picked up by Fox News host Tucker Carlson. It was not long before the Russian government, which had spread stories of Ukrainian biolabs in the past, adopted the narrative as a belated justification for the invasion.
We investigated how the biolabs narrative was amplified on Twitter and made a disturbing (if not quite surprising) discovery. Most of those responsible for sending the story viral tried to debunk it, but only ended up giving it more oxygen.
Debunking gone wrong
We first set out to identify coordinated groups of conspiracy theorists promoting the bioweapons theory.
To do this, we searched for Twitter accounts that retweeted posts mentioning both Ukraine and biolabs. Then, to see how these accounts were connected to each other, we looked to see if two accounts had retweeted the same thing at the same time. We found 1,469 such accounts and 26,850 links between them.
In the visualization of our results below, each dot is an account that retweeted at least one post about Ukrainian biolabs. When two accounts repeatedly retweeted the same thing within a minute of each other, we drew a line between them.
You can see that accounts are divided into groups of coordinated retweeting behaviors. We found 50 such clusters, and 49 of them were trying to debunk the bioweapons theory. Only a small group (circled in blue) tried to spread it.
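The coordination detection described above can be sketched in code. This is a minimal illustration, not the authors' actual pipeline: the record format, the 60-second window, and the repeat threshold are assumptions made for the example, and the data here is a toy stand-in for what would come from the Twitter API.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical retweet records: (account, tweet_id, timestamp in seconds).
# In the study these would come from Twitter data; here we use toy values.
retweets = [
    ("a", "t1", 0), ("b", "t1", 30),     # a and b co-retweet t1 within 60s
    ("a", "t2", 100), ("b", "t2", 140),  # ...and t2, so their tie repeats
    ("c", "t1", 500),                    # c retweets t1 much later: no tie
]

def coretweet_edges(retweets, window=60, min_repeats=2):
    """Link two accounts if they retweeted the same post within
    `window` seconds of each other at least `min_repeats` times."""
    by_tweet = defaultdict(list)
    for account, tweet, ts in retweets:
        by_tweet[tweet].append((account, ts))
    counts = defaultdict(int)
    for events in by_tweet.values():
        for (u, tu), (v, tv) in combinations(events, 2):
            if u != v and abs(tu - tv) <= window:
                counts[tuple(sorted((u, v)))] += 1
    return {pair for pair, n in counts.items() if n >= min_repeats}

print(coretweet_edges(retweets))  # {('a', 'b')}
```

The clusters in the visualization would then correspond to connected components of the resulting graph, which a library such as networkx can extract directly.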
Within the other major groups in this network, we saw tweets from accounts working to debunk the bioweapons conspiracy, including White House press secretary Jen Psaki, the Pentagon, the Kyiv Independent and Sky News.
Our analysis shows that those who did the most to spread the narrative were those who tried to debunk it. Most clusters retweeted Psaki (dotted circle on the right).
Disinformation for all
One place to begin to understand what is going on is with American researcher Kate Starbird's idea of "participatory disinformation".
This process often begins with highly visible users (like politicians, celebrities, or opinion leaders) disseminating information to their online audience.
In the case of the biolabs conspiracy theory, however, the narrative began on the alt-tech platform Gab and gained traction on Twitter through the efforts of a fringe QAnon account. As the discussion continued on Twitter, the theory was picked up by the Chinese and Russian foreign ministries, culminating in a segment on the Fox News show Tucker Carlson Tonight.
This is how a conspiracy theory becomes “news”. Audiences filter news through their own view of the world, which is already influenced by the media with which they interact regularly. Audiences construct, modify and promote these interpretations in their own social networks.
“Grassroots” participants pick up on the misinformation circulating in their communities, augment and spread it; the process repeats itself in a self-perpetuating feedback loop.
When political actors such as Psaki or Russian government officials tweet about a conspiracy theory, it doesn't matter whether they try to dispel it or reinforce it: they end up giving it oxygen.
The new conspiracism
If working to debunk false narratives only extends the feedback loop, what else can be done?
Participatory disinformation cycles have helped bring about a crisis in how we, as social groups, make sense of the world.
American political scientists Russell Muirhead and Nancy L. Rosenblum call the outcome of this crisis the "new conspiracism".
Where old-style conspiratorial thinking relied on elaborate theories to back up its claims, for the new conspiracists an idea may be true simply because it garners a lot of attention.
The spread of new conspiracies intensified with the erosion of trust in traditional institutions over the past decades.
Donald Trump and other politicians around the world have worked to accelerate this erosion, but they are only part of the problem.
A bigger part is that misinformation is lucrative for social media companies, and social media is integral to how we socialize and form our opinions.
What can be done?
Time and time again, we have witnessed conspiracy theories spread on social media, contributing to political polarization and undermining democratic authority.
It is time to rethink our media ecosystem and the way we regulate it, before trust in democratic institutions and principles declines further.
Tackling this problem is a herculean task, and it is not enough for countries to legislate and regulate platforms individually. It must be a global effort. Financial penalties are no longer enough – there needs to be systemic change that stops platforms from profiting from misinformation and disinformation.
Similarly, politicians and communicators such as Psaki need to be better informed that engaging with these conspiracy theories can have unintended effects: attempts to educate or debunk can result in global amplification.
For regular social media users, the advice, as always, is to think twice before sharing or retweeting.
When a piece of content evokes a strong emotional reaction, it can often be a sign that misinformation is at play. If you really want to share something, taking a screenshot of the content is better than amplifying the source directly, because it cuts the disinformation's originator out of the chain.
/ Courtesy of The Conversation. This material from the original organization/authors may be ad hoc in nature, edited for clarity, style and length. The views and opinions expressed are those of the authors.