Explained: How profit-driven algorithms disconnect the public from real news


New Delhi: In the age of social media misinformation, what evidence reaches which parts of the audience increasingly depends on automated algorithms curated by social media platforms rather than on scientists, journalists, or the platforms’ users themselves, scientists warn.

This competition for public attention has produced at least three urgent lessons that the scientific community must confront as online information environments rapidly replace traditional mainstream media, according to an opinion piece published Thursday evening in the journal Science.

“The rules of scientific discourse and the systematic, objective, and transparent evaluation of evidence are fundamentally at odds with the realities of debates in most online spaces,” wrote Dominique Brossard and Dietram Scheufele of the University of Wisconsin-Madison.

“It is debatable whether social media platforms designed to monetize outrage and disagreement among users are the most productive channel for convincing skeptical audiences that the established science on climate change or vaccines is not up for debate,” they added.

Micro-targeted information increasingly dominates social media, organized and prioritized algorithmically based on audience demographics, an abundance of digital trace data, and other consumer information.

Partly as a result, hyper-polarized public attitudes on issues such as Covid-19 vaccines or climate change are emerging and growing in separate echo chambers.

“Unfortunately, social science research suggests that rapidly changing online information ecologies are likely to be unresponsive to scientists uploading content – no matter how appealing – to TikTok or YouTube,” the scientists said.

We have recently seen a bombardment of fake news on serious topics such as Covid-19, elections, and religion, and social media platforms have failed to curb the spread of misinformation as billions of people connect to their various platforms.

According to the opinion piece, the scientifically questionable use of anecdotal evidence or appeals to scientific authority figures is driven partly by 280-character constraints on platforms like Twitter and partly by generations of science communication training urging scientists to tell more engaging stories.

Unfortunately, this arms race over the most effective narratives has its risks.

“Decades of communications research indicate that anecdotal social media accounts of breakthrough severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections or serious adverse reactions to Covid-19 vaccines, regardless of their rarity, will be imprinted far more effectively in people’s memories than pages of solid statistical data documenting herd immunity,” the scientists lamented.

Currently, algorithms that select and tailor content based on an audience member’s social context, personal preferences, and a wealth of digital trace data increasingly determine what scientific information an individual is likely to receive in Google searches, Facebook feeds, and Netflix recommendations.

“For audiences less engaged with credible science content, artificial intelligence, if left unchecked, could potentially slow the flow of reliable Covid-19 information to a trickle, drowning it in a surplus of online noise,” they said.

At present, science can do little to escape this dilemma.

The same profit-driven algorithmic tools that bring curious, science-friendly followers to scientists’ Twitter feeds and YouTube channels will increasingly disconnect scientists from the audiences they urgently need to connect with.

“Moving forward, addressing this challenge will require partnerships between the scientific community, social media platforms and democratic institutions,” the scientists said.



Sharon D. Cole