Algorithms | Columnist

So there’s this thing called situational awareness. It has three main components: perception of the data and elements of the environment; comprehension of the meaning and significance of the situation; and projection of future states and events.

Situational awareness is useful in almost any field. In emergency medicine especially, I find it essential to know the ward, patient, and staff factors at any given time and to predict what is likely to happen next. You observe information and data, analyze it, understand it, and then project the trajectory ahead.

I can also use situational awareness to understand the ideological divide in the world right now. First, as humans, we created social media. Then there were too many platforms and too much information to digest at once. So, we created algorithms. Most platforms have now migrated to an algorithm-based feed. Social media algorithms analyze user behavior and prioritize content that the platform thinks the user wants to see and is most likely to engage with.

When you first join a social media platform, the feed is initially a random shuffle, but after a short while it starts to match your interests. The intention is that the more content you like you are fed, the longer you will stay engaged. And the longer you stay, the better the platform learns your likes and dislikes, and the more of your interests it can serve up, so you stay even longer. A rabbit hole.

So, the Facebook algorithm uses your reactions to posts: likes, loves, and so on. It uses dwell time, the time you linger on a message, video, or image. It also predicts and prioritizes the friends whose posts you want to see first. Similarly for Instagram: the algorithm tracks which posts interest you, whom you follow, whom you tag, and whom you message.

For Twitter, the algorithm focuses on your timeline, the stream of tweets from users you follow. It uses the “While you’re away” and “Show me the best tweets first” features to identify the accounts you most want to see. The same goes for TikTok, Netflix, Apple TV, other streaming platforms, and even your internet browser. The algorithm-based feed determines the content you are exposed to as well as the advertisements you view.
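The signal-weighting described above can be sketched as a toy ranking function. To be clear, the weights and field names here (reaction_score, dwell_seconds, affinity) are invented for illustration; no platform publishes its actual model.

```python
# Toy sketch of an engagement-ranked feed. All signal names and weights
# are illustrative assumptions, not any real platform's algorithm.

def engagement_score(post):
    """Weighted sum of hypothetical engagement signals for one post."""
    return (2.0 * post["reaction_score"]   # reactions you gave similar posts
            + 0.1 * post["dwell_seconds"]  # how long you lingered on similar content
            + 3.0 * post["affinity"])      # how often you interact with this author

def rank_feed(posts):
    """Show the highest-scoring posts first, rather than the newest first."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "news",   "reaction_score": 1, "dwell_seconds": 5,  "affinity": 0.2},
    {"id": "friend", "reaction_score": 4, "dwell_seconds": 30, "affinity": 0.9},
    {"id": "hobby",  "reaction_score": 3, "dwell_seconds": 60, "affinity": 0.5},
])
print([p["id"] for p in feed])  # → ['friend', 'hobby', 'news']
```

The point of the sketch is the feedback loop: every reaction and every second of dwell time feeds back into the scores, so the feed narrows toward whatever already engages you.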

The end result is mainly threefold. First, you get addicted. The longer you stay, the more rabbit holes present themselves for you to get lost in. You know how it goes. If you allow yourself, you can stay on Netflix, Facebook, IG, TikTok or Twitter for hours. The longer you stay, the more interesting it becomes, and it’s hard to wake up and do something else. The second effect is like a house of mirrors.

Everything you see in your feed or in your recommendations is only a reflection of you, your friends, your interests, your group, your clique, your tribe. It is important to have interests, friends, cliques and tribes, but you are never exposed to anything else. So, for example, if you like Creole food, the mirror will give you more and more Creole food, which is great. But since you may not have been exposed to a wide variety of foods, there are foods, things, and ideas that you don’t know whether you like or not. Like kangaroo testicles, for example. But since the algorithm has already put you in the box, down the rabbit hole, Creole food is all you will ever know.

The third effect is that you are so boxed in that you cannot see or appreciate what other people like or see. What you or your clique think and do is considered normal reality. And so everyone has an increasingly definite, narrow and bounded idea of what is the right thing and the normal reality. Consider, for example, tomorrow’s midterm elections in the United States. We have never been so divided in our ideologies, but also in our misunderstanding of the ideology of other groups. More importantly, in our lack of the NEED to understand their ideology.

So the far right or the far left are only exposed to far-away ideas with far-away cliques in a world of far-away concepts, such that they think the far right or the far left is the world, and whoever believes anything else is totally wrong. The possible trajectory? Lack of acceptance of election results, political unrest, civil war, implosion. This trend is observed all over the world, in Brazil, Italy, and Sweden. In Russia and China, the state media is like one big algorithm, and you are only exposed to the ideology the state allows you to see.

So we created the algorithm to control the volume of what we see. Now it has split us so that each of us sees only one rabbit hole, one point of view, which we take to be absolute. So much so that it can lead to civil war and implosion, which would affect all countries. We wanted to reduce what we see; now we don’t see anything. Not even kangaroo testicles.

Dr. Joanne F Paul is a Lecturer in Pediatric Emergency Medicine at UWI and a Fellow of the TEL Institute
