The pitfalls of social media algorithms

Ten years ago, it was fashionable to talk about the Facebook social media platform as if it were a country. Commentators measured its active user base against the giant and growing populations of China and India to give an idea of its size, reach and power. Back then, Facebook had around 850 million users; now it’s closer to 3 billion. China and India, by comparison, are now home to about 1.4 billion people each.

With such large and sudden “population” explosions on social media, we became concerned about what these platforms were doing with all the data they were harvesting. Despite these serious concerns, social media was also seen as a positive agent of change. The role Facebook, Twitter and others played in the 2011 Arab uprisings in bringing young people together was consistently cited as an example of the powerful catalyst these platforms could be.

At that time, Facebook’s login page featured the slogan “it’s free and always will be.” The phrase was key to explaining the platform’s popularity. The tradeoff, such as it was, involved using a dazzling app at no cost in exchange for handing over some personal data to big tech.

Active users across all social media platforms are approaching 5 billion in 2022, roughly equal to the entire population of Asia, and the years since have transformed the so-called digital town square into a difficult space. The quid pro quo seems more inequitable than ever.

Part of this is because the combination of social media and smartphones has made us distracted, addicted and insular beings. It is possible to spend hours on these platforms with an algorithm serving you an endless stream of machine-selected content.

Online giant Amazon’s recommendation algorithm began outperforming human editors’ choices long ago because it used a filtering system based on links between products rather than between customers. It learned that if you were looking for a copy of, say, F Scott Fitzgerald’s The Great Gatsby, there was a good chance you were also interested in the works of Ernest Hemingway. It didn’t need to understand why customers might be interested in both authors; it just needed to know that there was a correlation between those authors’ products.
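The underlying idea, item-to-item collaborative filtering, can be sketched in a few lines. What follows is a minimal illustration of the technique, not Amazon’s actual system; the purchase histories and recommendations are invented for the example:

```python
from collections import defaultdict
from itertools import combinations

# Toy purchase histories -- invented data for illustration.
baskets = [
    {"The Great Gatsby", "The Sun Also Rises"},
    {"The Great Gatsby", "A Farewell to Arms"},
    {"The Great Gatsby", "The Sun Also Rises", "Tender Is the Night"},
    {"Moby-Dick", "The Sun Also Rises"},
]

# Count how often each pair of products is bought together.
co_counts = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(product, k=3):
    """Return the k products most often co-purchased with `product`."""
    neighbours = [(other, n) for (p, other), n in co_counts.items() if p == product]
    neighbours.sort(key=lambda pair: -pair[1])
    return [other for other, _ in neighbours[:k]]

print(recommend("The Great Gatsby"))
# ['The Sun Also Rises', 'A Farewell to Arms', 'Tender Is the Night']
```

Notice that nothing in the sketch models why readers like both authors; the co-purchase counts alone drive the suggestion, which is exactly the point made above.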

Likewise, social media doesn’t have to question why I’m searching for and interested in content on a particular topic; it just has to fulfill that need, whether it’s good for me or not.

Some people thought that social media would allow us all to be exposed to a wide range of opinions, but it has become far too easy to be caught in an echo chamber, consuming, liking and commenting on posts that reinforce our own prejudices, and to disappear down dangerous trapdoors with no easy exit. The separation between our digital personas and our real selves seems to widen every year.

As user bases continue to grow on social media platforms – and TikTok now captures large swathes of the attention economy – the inner workings of the technology need to be redesigned and rebuilt.

A year ago, Frances Haugen, a former Facebook product manager turned whistleblower, told a US Congressional hearing that the platform’s algorithms promote posts with high levels of engagement, often pushing harmful content towards users. Facebook strenuously denied the accusation. Founder Mark Zuckerberg said claims of prioritizing profit over wellbeing were “simply not true”. Ms Haugen has since joined a new group, the Council for Responsible Social Media, which launched this week and is pushing for urgent change.
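To see what engagement-first ranking means in practice, consider a minimal sketch. This is a generic toy model, not Facebook’s actual formula; the scoring weights and example posts are invented:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    reshares: int

# Invented weights: reactions that signal heated discussion
# (comments, reshares) count for more than a passive like.
WEIGHTS = {"likes": 1.0, "comments": 5.0, "reshares": 10.0}

def engagement_score(post: Post) -> float:
    """Score a post purely by reactions, with no notion of harm."""
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["reshares"] * post.reshares)

feed = [
    Post("Calm, factual explainer", likes=120, comments=4, reshares=2),
    Post("Outrage bait", likes=40, comments=60, reshares=35),
]

# The divisive post wins: 40 + 300 + 350 = 690 vs 120 + 20 + 20 = 160.
feed.sort(key=engagement_score, reverse=True)
print([p.text for p in feed])  # ['Outrage bait', 'Calm, factual explainer']
```

Because nothing in the score distinguishes outrage from insight, whichever post provokes the most reactions rises to the top.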

Last month, an inquest in the UK concluded that teenager Molly Russell’s death in November 2017 occurred after exposure to the negative effects of online content. The inquest heard how Molly, who was 14 when she died, saved, shared or liked 16,300 Instagram posts in the six months before her death. Of these, 2,100 were related to depression, self-harm or suicide. Her case is likely to provide the springboard for action in the UK.

Any proposed legislation will require careful calculation and application. Too strict and it becomes impractical; too lenient and it’s toothless.

So how do you realistically govern a digital territory now inhabited by almost five billion people?

Beyond general appeals to steady hands and good faith, a prescription for change requires the following.

The platforms themselves need to further develop their internal audit and monitoring procedures. Self-regulation may seem like a terrible oxymoron, but it can also be the best way forward as long as it champions prevention and solutions.

Platforms need to be more transparent and accountable to users about why content is served to them, and quicker to deal with inappropriate or harmful posts.

Algorithms need to be rebuilt, and machine learning needs to understand why, as well as what, interests its users.

And finally, safeguard legislation should be written with purpose rather than symbolism.

Posted: October 14, 2022, 4:00 AM

Sharon D. Cole