The Recorder – My Turn: Hate has gone viral — how social media algorithms breed domestic terror

What would motivate you to attempt to overthrow the power structures of your country, risking your life and well-being? What if you thought every last facet of your society was built to prey on the most vulnerable? Would that be enough?

For a growing portion of Americans, this exact situation is their reality. It just isn’t a real one. In March 2021, 14% of Americans expressed belief in QAnon, the political conspiracy theory that a satanic, child-eating, sex-trafficking network operates in opposition to Donald Trump’s presidency. That number rose to 17% by September 2021, according to the Public Religion Research Institute, a nonpartisan research think tank.

Since its birth in 2017, QAnon has evolved to include the belief that the 2020 election results were fraudulent, a belief shared by a much wider range of people – more than 40% of Americans, if a January 2022 Axios poll is to be believed. With the numbers that high, it is worth asking how this misinformation spreads so widely. Social media is one of the main channels.

Regardless of the platform, social media companies like Facebook, Twitter, YouTube, and Instagram all profit from ad revenue. Companies that want to advertise their products bid for every second users spend on social media. This profit motive has led social media companies to devise algorithms that keep users staring at their devices for as long as possible by suggesting the content they will find most engaging.

As humans, our brains are powerfully drawn to things that make us feel we belong or that evoke strong emotions. Sadly, of all the content available on the internet, conspiracy content is some of the best at pushing those psychological buttons. And because it is so engaging, the algorithm encourages its growth by nudging users in its direction.

Suppose a person is browsing their favorite social media app’s personalized feed, such as a “For You” or “Suggested” page, when they come across a news item: say, a headline claiming that new information has been uncovered about the lab-leak theory of the coronavirus’s origin. The user never reads far enough to learn that the sensational headline contradicts the article itself, which reports that the virus did not come from a lab; they simply like the post. The algorithm picks up on this newfound interest, and soon the user’s feed is filled with misinformation about the pandemic, vaccine effectiveness, and more. In the comments on these posts, they chat with other users falling down the same rabbit hole and join increasingly extreme online groups with similar beliefs. Situations like this play out constantly.
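To make that feedback loop concrete, here is a toy sketch in Python. It is not any platform’s actual code; the item list, engagement numbers, and function names are all invented for illustration. It shows only the mechanism described above: a ranker that multiplies a user’s inferred interest by an item’s engagement rate, so a single like on a sensational item pushes similar content to the top of the feed.

    # Toy sketch of an engagement-driven feed, invented purely for illustration.
    # Real recommender systems are vastly more complex; this only shows the
    # feedback loop described above: one like shifts every future ranking.

    # Each item pairs a topic with a hypothetical average engagement rate.
    CATALOG = [
        ("local news story",  0.02),
        ("sports recap",      0.03),
        ("lab-leak headline", 0.09),  # sensational content engages more
        ("vaccine rumor",     0.08),
    ]

    def rank_feed(catalog, interests):
        """Score each item as (inferred interest) * (engagement rate), best first.

        Note what is missing: the ranker has no notion of truth. A false but
        engaging item outranks an accurate but dull one whenever its score wins.
        """
        scored = [(interests.get(topic, 0.01) * rate, topic) for topic, rate in catalog]
        return sorted(scored, reverse=True)

    def record_like(interests, topic):
        """A single like doubles the user's inferred interest in that topic."""
        interests[topic] = interests.get(topic, 0.01) * 2.0

    # The user starts out interested in local news, then likes one sensational post.
    interests = {"local news story": 0.05}
    record_like(interests, "lab-leak headline")

    # Similar content now tops the feed, and each further like amplifies the effect.
    for score, topic in rank_feed(CATALOG, interests):
        print(f"{score:.4f}  {topic}")

Under these made-up numbers, a single like is enough to push the sensational headline above the user’s original interests, and liking what the feed then serves doubles the effect again: the rabbit hole in miniature.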

On the internet, trustworthy sources of information enjoy little to no advantage. Anything can be liked, picked up by the algorithm as a trend, and sent out to the masses. There is no distinction between what is true and what is false, except that the latter is often far more engaging. Fake news outperforms real news online, and performance is all that matters to the algorithms. It is a perfect storm for creating a bubble of misinformation, where users learn to be wary of anything that doesn’t agree with their worldview.

Going back to the question posed at the beginning: the belief that action is desperately needed, and the reinforcement of that belief by other users, is quite the motivating factor. People think that what they say and do on the internet has no consequences in the real world. But they are wrong. At a pizza parlor in Washington, D.C., in Christchurch, New Zealand, and in Buffalo, New York, individuals committed acts of terrorism motivated by beliefs they encountered, or had reinforced, online.

The internet has the power to connect us regardless of nationality, age, language, or economic status. But this power is not without consequence. Left unchecked, social media companies have, by accident, created the greatest decentralized terrorist asset in human history. If we are to stop the spread of misinformation and hate in this country, social media must be reformed.

Zachary Rutherford is a recent graduate of Four Rivers Charter Public School in Greenfield. This piece was originally written as part of a civics class that covered media literacy in the digital age and the impact of social media on society. This is the second of three pieces written by Four Rivers students to be released this week.
