How to stop the damage caused by uncontrolled algorithms

A picture of an injured Ukrainian social media influencer, staggering from the rubble of a maternity hospital hit by Russian forces, has become a new symbol of the global disinformation wars. When the shocking image of Marianna Vishegirskaya, known as @gixie_beauty on Instagram, went viral, a post on a Russian state-controlled site ironically named “War On Fakes” claimed that Vishegirskaya was actually a “crisis actress” paid to play the role. Using this post as a launching pad, the Russian Embassy in the UK repeatedly tweeted the link along with “#fakenews” allegations. Twitter quickly deleted the tweets, and many Western outlets published articles debunking the claim.

Some social media watchers have claimed this as a victory against misinformation. If so, it is a small one, and it is certainly not a blueprint for broader success. To solve the problem of misinformation, we need to think about the system that makes it so cheap, easy, and profitable to sow inaccuracies. The good news is that there are a number of concrete steps governments and social media companies can take to curb misinformation without reinventing the wheel or killing social media.

The disinformation war is a forever war

Governments have always used propaganda to serve national and ideological ends. The United States and its allies created news outlets such as Voice of America (VOA) and Radio Free Europe (RFE) to combat Nazi and Soviet propaganda. It was a far more orderly war, fought largely through the broadcast power of radio towers. Through these efforts, the West learned one of the core lessons of countering propaganda: provide credible, accurate information in an accessible format.

The internet, ubiquitous connectivity, and social media have radically changed the rules of information warfare. The internet has made it inexpensive to create seemingly believable content. Global internet access has made low-cost mass distribution trivial. Social media companies have changed the way we consume media, applying machine learning and algorithms to shape our information diets and push us to interact more with the news. The idea of drumming up interest by serving sensational news is not new; “if it bleeds, it leads” has long been an apt description of media priorities.

These three factors, however, combine to harm society. Digital media production tools made it easy to create disinformation that looked and sounded like news. So easy, in fact, that non-state commercial operations sprang up in places like Macedonia to create fake political content purely to generate clicks. Internet and social media algorithms spread that misinformation widely while amplifying our predictable responses to it, encouraging the creation of still more misinformation. Russia and China have eagerly embraced this new reality of information warfare, even targeting Canadians with relentless anti-American propaganda.

The Wagner Group, a Russian mercenary company, for example, often uses disinformation to prop up unpopular dictatorships in Africa and other regions, targeting political opponents with disinformation designed to stoke unrest and even violence. In Mali, this is part of both Wagner’s business strategy and Russia’s political strategy, which aims to profit from political conflict while undermining stability in regions strategically important to the United States.

Because disinformation is so inexpensive and so easy to create and spread, the West is now locked in an unwinnable game of whack-a-mole. Asking big social media companies to drastically change their business models is not an option; these platforms were designed from the ground up for virality. Banning social media is not a solution either. Countless people use social media in ways that benefit them, and society uses it as a tool for collective action and creative expression.

There are, however, ways to modify algorithms and manage misinformation that could break its grip on our information sphere. The US practice of “pre-bunking” (the preemptive release of intelligence to counter Russian disinformation) changed the information battlefield against Russia and provided crucial lessons on how to fight and win this broader information war. What the West needs now is a cohesive counter-disinformation strategy and a mechanism that works in partnership with social media companies to restore a healthier information sphere.

Deliberation as choice architecture

In their book Nudge, Richard Thaler and Cass Sunstein describe the concept of incentivizing actions through a “choice architecture” that leaves users ample freedom but nudges them in a desired direction. In the case of misinformation, the desired outcome is less sharing of factually incorrect content, and there are several ways to inject friction into the sharing process to slow it down. Twitter already asks users whether they want to share articles they haven’t actually clicked on and read. Facebook could easily inject an “Are you sure you would like your mom to read this?” prompt to make users stop and think before clicking share. Alternatively, if a user has never interacted with the account whose content they are sharing, the platform might ask, “You have never read or retweeted anything from this account. Do you trust it?” The overall goal would be to encourage people to stop and think before sharing.
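To make the idea concrete, here is a minimal sketch in Python of what such friction checks might look like inside a sharing pipeline. Everything in it, from the ShareAttempt fields to the friction_prompt function, is hypothetical illustration under the assumptions described above, not any platform’s actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ShareAttempt:
    user_id: str
    post_id: str
    author_id: str
    user_opened_link: bool   # did the user click through and read the article?
    prior_interactions: int  # times the user has engaged with this author before

def friction_prompt(attempt: ShareAttempt) -> Optional[str]:
    """Return a prompt to show before sharing, or None to share immediately.

    Each prompt adds deliberate friction (a 'nudge') without blocking the share:
    the user can always dismiss it and post anyway.
    """
    if not attempt.user_opened_link:
        return "You haven't opened this article yet. Read it before sharing?"
    if attempt.prior_interactions == 0:
        return ("You have never read or retweeted anything from this account. "
                "Do you trust it?")
    return None  # no friction needed; proceed with the share

# Example: a user shares an unread article from an unfamiliar account.
prompt = friction_prompt(ShareAttempt("u1", "p9", "a7",
                                      user_opened_link=False,
                                      prior_interactions=0))
if prompt:
    print(prompt)  # surface the nudge in the UI; sharing remains possible
```

The design point is that the checks only ever return a question, never a refusal, which is what keeps this a choice architecture rather than censorship.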

Impose costs on bad actors

There are other ways to impose costs on bad actors, the easiest being to remove any content they post from the amplification algorithms. For example, social media companies might adopt a “three strikes” policy that de-amplifies any account with three confirmed reports of posting inaccurate content. That doesn’t mean such accounts can’t post, or that people who follow them directly can’t read their content; it simply throws sand in the wheels of virality. De-amplification would also wreck the business models of commercial entities that attract customers through misinformation. A more drastic measure would be to block commerce on these sites or accounts, which often profit from selling t-shirts and other merchandise. Social media platforms already respond to complaints about misinformation, but content moderation is low-status, low-paying work; giving moderators full-time jobs, better pay, and more authority to impose tougher penalties would go a long way toward raising the costs of misinformation.
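A minimal sketch of such a rule, again in Python and again purely illustrative (the class, the strike threshold, and the idea of a multiplicative ranking factor are all assumptions, not any platform’s documented policy), might look like this:

```python
from collections import defaultdict

STRIKE_LIMIT = 3  # confirmed violations before de-amplification kicks in

class DeamplificationPolicy:
    """Sketch of a 'three strikes' rule: the account keeps posting, and
    direct followers still see its content, but the algorithmic ranking
    boost is withdrawn once the strike limit is reached."""

    def __init__(self, strike_limit: int = STRIKE_LIMIT):
        self.strike_limit = strike_limit
        self.strikes = defaultdict(int)

    def record_violation(self, account_id: str) -> None:
        """Called when moderators confirm a misinformation report."""
        self.strikes[account_id] += 1

    def amplification_factor(self, account_id: str) -> float:
        """1.0 = normal ranking; 0.0 = excluded from recommendation feeds.
        De-amplified accounts remain readable by direct followers."""
        return 0.0 if self.strikes[account_id] >= self.strike_limit else 1.0

policy = DeamplificationPolicy()
for _ in range(3):
    policy.record_violation("clickfarm_42")
print(policy.amplification_factor("clickfarm_42"))  # 0.0: out of the algorithm
print(policy.amplification_factor("honest_user"))   # 1.0: unaffected
```

Because the penalty is a ranking factor rather than a ban, it targets exactly what misinformation businesses monetize, virality, while leaving speech itself in place.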

Pre-bunk everything, everywhere

Both suggestions above focus on altering algorithms to reduce misinformation. An equally powerful change would be to establish a policy of “pre-bunking.” During the war in Ukraine, the US government shaped the information space in Europe by choosing to publish intelligence that anticipated well-worn Russian disinformation narratives. Simply saying publicly what intelligence agencies were hearing in secret changed the equation, debunking Russian disinformation in the West and establishing a factual baseline. In the United States and the West, pre-bunking deprived Russian intelligence agencies of the ability to cast doubt on facts on the ground and to outrun public perception by spreading untruths on social media unchecked and unopposed.

It was a quiet preemptive effort, but it should become the norm. Wherever bad actors are spreading disinformation, the United States is likely collecting intelligence on what is going on. Just as the United States spends hundreds of billions on warships and aircraft, it should invest in an information warfare machine focused on continuously pre-bunking disinformation. Compared to the cost of military equipment, the price would be paltry. Giving up secrecy carries risks and should always be a considered choice. But in the age of open-source intelligence, when private satellites provide military-grade photography and information warfare takes place in real time, sharing more rather than less with the public will generate goodwill and credibility while advancing Washington’s geopolitical objectives. Within social media algorithms, pre-bunking can serve as a counterbalance to misinformation, ballast that proactively anchors the information space rather than leaving a continuous feedback loop of fake posts and stories.

What technology takes away, technology also gives. The thoughtless adoption of algorithm-driven content engines has made it easier than ever to create and spread misinformation, and clever tweaks to those same algorithms could yield disproportionately beneficial results. Over time, people may even begin to think differently about how and what they share, and new business models may emerge that rely less on blind clicks and raw attention capture. Misinformation will always be with us; it is part of the world of information sharing. Whether we let it dominate our information sphere and drive the conversation depends on offering better and broader choices and feedback, and on fixing the algorithms that let bad actors dominate the information sphere in the first place.

Vivek Wadhwa and Alex Salkever are the authors of The Driver in the Driverless Car and From Incremental to Exponential: How Big Companies Can See the Future and Rethink Innovation. Their work explains how advanced technologies can be used for both good and ill, to solve humanity’s grand challenges or to destroy it.

Image: Wikimedia Commons.
