Psychology experts urge social media giants to increase transparency around algorithms to protect users’ mental health

In a new article published in the journal Body Image, a team of psychology researchers describes a mountain of evidence linking social media use to body image issues. The researchers explain how algorithms can intensify this link and urge social media companies to take action.

Appearance-based social media platforms like TikTok seem to be particularly harmful to users’ body image. On these platforms, teens are continuously exposed to filtered and edited content that features unrealistic body standards. According to recent evidence, this distorted environment increases the risk of body dissatisfaction and harmful conditions such as body dysmorphia and eating disorders.

“I’m interested in the risk and protective factors of body image, and some of my more recent research has focused on the role of social media,” explained lead author Jennifer A. Harriger, professor of psychology at Pepperdine University. “I became interested in the use of algorithms by social media companies, and the revelations from whistleblowers that companies were aware of the harm their platforms were causing to young users. This article was written as a call to arms for social media companies, researchers, influencers, parents, educators, and clinicians. We need to do a better job of protecting our young people.”

In their report, Harriger and her team explain that these effects can be exacerbated by social media algorithms that personalize the content presented to users. These algorithms “rabbit hole” users into increasingly extreme content that is less closely monitored and designed to keep them on the platform.

Importantly, the damage caused by these algorithms is not unknown to social media companies, as recent whistleblower testimony makes clear. Former Facebook employee Frances Haugen leaked documents revealing that the social media giant was aware of internal research linking its products to mental health and body image problems in teens. A TikTok whistleblower later leaked evidence of an algorithm that carefully manipulates the content presented to users, prioritizing emotionally triggering material in order to keep them engaged.

“Social media platforms can be valuable opportunities to connect with others, and users have the ability to personalize their own experiences (choosing what content to follow or interact with); but social media platforms also have drawbacks. One such drawback is these companies’ use of algorithms designed to keep the user engaged for longer periods of time,” Harriger told PsyPost.

“Social media companies are aware of the damage caused by their platforms and their use of algorithms but have not made efforts to protect users. Until these companies become more transparent about the use of their algorithms and offer users the ability to opt out of content they don’t want to see, users are at risk. One way to minimize risk is to only follow accounts that have a positive influence on mental and physical health and to block triggering or negative content.”

In their article, Harriger and her colleagues present recommendations to combat these algorithms and protect the mental health of social media users. First, they point out that the main responsibility lies with the social media companies themselves. The authors reiterate suggestions from the Academy for Eating Disorders (AED), saying that social media companies should increase the transparency of their algorithms, take steps to remove accounts sharing eating disorder content, and make their research data more publicly available.

The researchers add that social media platforms should disclose to users why the content they see in their feeds was chosen. They should also limit micro-targeting, a marketing strategy that targets specific users based on their personal data. Additionally, these companies are socially responsible for the well-being of their users and should take steps to increase awareness of weight stigma. This can be done by consulting body image and eating disorder experts on ways to encourage positive body image among users, perhaps by promoting body positive content on the platform.

Influencers, too, can shape the body image and well-being of their followers. Harriger and her colleagues suggest that influencers consult with body image experts for guidance on body-positive messaging. Positive actions can include educating their audiences about social media algorithms and encouraging them to counteract the algorithms’ negative effects by following and engaging with body-positive content.

Researchers, educators, and clinicians can examine ways to prevent the negative impact of social media on body image. “It is difficult to empirically research the effect of algorithms, because each user’s experience is personalized based on their interests (e.g., what they have clicked on or viewed in the past),” Harriger noted. “Research can, however, examine the use of media literacy programs that address the role of algorithms and equip young users with tools to protect their well-being on social media.”

Such research can help inform social media literacy programs that teach teens about social media advertising, encourage them to think critically when using social media, and teach them strategies for increasing the positive content displayed in their feeds.

Parents can teach their children positive social media habits by modeling healthy behavior with their own electronic devices and setting rules and boundaries around their children’s social media use. They can also organize discussions with their children on issues such as social media image editing and algorithms.

Overall, the researchers conclude that social media companies bear the ultimate responsibility for protecting the well-being of their users. “We emphasize that system-level changes must occur so that individual users can effectively do their part to preserve their own body image and well-being,” report the researchers. “Social media companies must be transparent about how content is delivered if algorithms continue to be used, and they must provide clear ways for users to easily opt out of content they do not want to see.”

The study, “The dangers of the rabbit hole: Reflections on social media as a portal into a distorted world of edited bodies and eating disorder risk and the role of algorithms,” was written by Jennifer A. Harriger, Joshua A. Evans, J. Kevin Thompson, and Tracy L. Tylka.

Sharon D. Cole