The Politics of Social Media Algorithms
A note from the author
In the 21st century, social media algorithms control the flow of much of the world’s information and social connection. While we have never been more digitally connected as a civilization, it is worth examining the technology that drives these connections. Who controls what we see on social media? Why are we recommended certain content? For many people, their worldview is at the mercy of their social media feed, and thus of the algorithms that populate it. Which consequences are intentional and which are unintentional, and how do they impact the different groups affected by this technology? We will explore all of these questions in this assignment.
In his journal article “Do Artifacts Have Politics?”, Langdon Winner defines politics as “arrangements of power and authority in human associations as well as the activities that take place within those arrangements” (Winner, 1980). Examined through this definition, social media algorithms are highly political. Their control over content distribution aligns with Winner’s concept of arrangements of power, and that power is wielded to impose specific values on users. The main value these algorithms embody is profit maximization, which in turn is driven by engagement and attention. Over the years, social media companies have optimized their algorithms to drive as much attention as possible to their platforms. Joanna Stern encapsulates this evolution in her article: “bye-bye, feeds that showed everything and everyone we followed in an unending, chronologically ordered river. Hello, high-energy feeds that popped with must-clicks” (Stern, 2021). This makes sense when you consider the motivations and politics of those driving the changes. For social media companies, the more engaging their platforms are, the more ads they can sell. “No algorithms, no ads” (Stern, 2021).
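To make the shift Stern describes concrete, here is a minimal, purely illustrative sketch of the difference between a chronological feed and an engagement-ranked one. The post fields and scoring weights are my own assumptions for the example; real platforms learn their ranking signals and do not publish them.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float      # seconds since epoch
    likes: int
    comments: int
    watch_seconds: float  # total time users have spent on the post

def chronological_feed(posts):
    """The old model: everything and everyone you follow, newest first."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    """The new model: rank by an engagement score so 'must-clicks' rise.
    The weights below are invented for illustration only."""
    def score(p):
        return 1.0 * p.likes + 2.0 * p.comments + 0.1 * p.watch_seconds
    return sorted(posts, key=score, reverse=True)
```

The point of the sketch is that nothing about the second function is neutral: whoever chooses the score function chooses which posts win attention.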
It is one thing to create an engaging feed. It is another to give each user a completely personalized, individual feed, fine-tuned to retain that user for as long as possible. How is it that these algorithms can almost read our minds? Ben Smith investigated this in his article about TikTok, in which he found internal company documents that detail this elaborate system. These documents illustrate that “for each video a kid watches, TikTok gains a piece of information on him. In a few hours, the algorithm can detect his musical tastes, his physical attraction, if he’s depressed, if he might be into drugs, and many other sensitive information. There’s a high risk that some of this information will be used against him. It could potentially be used to micro-target him or make him more addicted to the platform” (Smith, 2021). The documents also outline TikTok’s creator monetization program, which pays creators based on certain metrics, such as “likes, comments and playtime, as well as an indication that the video has been played” (Smith, 2021). Creators are rewarded for engagement and attention, which furthers the profit-maximization value that social media algorithms embody.
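The per-user personalization Smith describes can be pictured as a running interest profile that is nudged every time a video is watched. The sketch below is a toy illustration under assumed tag names and function names; it is not TikTok’s actual system, only the general shape of watch-time-driven profiling.

```python
from collections import defaultdict

def update_profile(profile, video_tags, watch_fraction):
    """Toy profiling step: the longer a viewer stays on a video, the more
    its tags (genre, mood, topic) are reinforced in their profile."""
    for tag in video_tags:
        profile[tag] += watch_fraction
    return profile

def next_video(profile, candidates):
    """Serve the candidate whose tags best match the accumulated profile."""
    def affinity(video):
        return sum(profile[tag] for tag in video["tags"])
    return max(candidates, key=affinity)

# Example: a few watched videos are enough to start steering the feed.
profile = defaultdict(float)
update_profile(profile, ["sad-music", "late-night"], watch_fraction=0.9)
update_profile(profile, ["comedy"], watch_fraction=0.2)
```

Each watch both reveals something about the user and narrows what they are shown next, which is exactly the feedback loop the internal documents describe.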
All things considered, it is clear that companies intentionally make their apps addictive and that they control how their algorithms work. However, they have (or claim to have) very little control over what content is actually used to engage users and the impact that content has. For example, Horwitz and Blunt of The Wall Street Journal explored Instagram Reels and tested the content and advertisements they were shown after following select accounts. “The Journal reporters set up the Instagram test accounts as adults on newly purchased devices and followed the gymnasts, cheerleaders and other young influencers. The tests showed that following only the young girls triggered Instagram to begin serving videos from accounts promoting adult sex content alongside ads for major consumer brands, such as one for Walmart that ran after a video of a woman exposing her crotch” (Horwitz & Blunt, 2023). Did Walmart intentionally choose to place its ads after this type of content? Did Instagram intentionally show this kind of content to this type of user? To both questions, the answer is most likely no. But there is also no doubt that these algorithms have grown beyond any one person’s control. “When the test accounts then followed some users who followed those same young people’s accounts, they yielded even more disturbing recommendations. The platform served a mix of adult pornography and child-sexualizing material, such as a video of a clothed girl caressing her torso and another of a child pantomiming a sex act” (Horwitz & Blunt, 2023). The reporters hypothesized that Instagram’s “behavioral tracking has discerned that some Instagram users following preteen girls will want to engage with videos sexualizing children, and then directs such content toward them” (Horwitz & Blunt, 2023). While social media algorithms are intentionally addictive, the type of content used to drive that addiction is often unintentional.
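The reporters’ hypothesis is, in essence, a description of collaborative filtering: recommend to a user what people with a similar follow graph engaged with. A minimal sketch of that general idea (assumed names and data structures, not Instagram’s actual ranking code) shows how disturbing pairings can emerge from behavioral data alone, with no person ever choosing them:

```python
from collections import Counter

def recommend_from_similar_users(target, follows, engagements, top_n=3):
    """Naive collaborative filtering:
    1. Find users who follow at least one account the target follows.
    2. Recommend whatever those 'similar' users engaged with most.
    No one reviews the pairings; they fall out of the data."""
    similar = {
        user for user, accounts in follows.items()
        if user != target and accounts & follows[target]
    }
    counts = Counter()
    for user in similar:
        counts.update(engagements.get(user, []))
    return [item for item, _ in counts.most_common(top_n)]
```

If the “similar” users engage with objectionable material, that material is what gets recommended, which is why the outcome can be unintended even when the mechanism is deliberate.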
The alarming consequences of these unintentional effects have become apparent. The algorithms prioritize content with high shock value, be it misinformation or sexually explicit material; users’ attention steers that content; and creators perpetuate the cycle by producing more of what the algorithms reward. While this vicious cycle affects users and creators alike, children and adolescents are among the most impacted groups. Senator Chris Murphy states that, now more than ever, impressionable children have relinquished “their online autonomy so fully to their phones that they even balk at the idea of searching the internet — for them, the only acceptable online environment is one customized by big tech algorithms, which feed them customized content” (Murphy, 2023). Murphy emphasizes that unregulated access to social media algorithms has hurt the mental health and overall well-being of millions of kids. Intentional or not, there are major implications for young people affected by social media algorithms.
In the end, the extensive influence of social media algorithms, driven by profit motives and engagement metrics, is highly political. These algorithms, designed to capture attention and maximize revenue, shape the information landscape and wield significant power over users’ experiences. Recognizing the political nature of this technology is essential for creating a digital environment that prioritizes transparency, ethics, and the well-being of users.
References
Horwitz, J., & Blunt, K. (2023, November). Instagram’s algorithm delivers toxic video mix to adults who follow children. The Wall Street Journal.
Murphy, C. (2023).
Smith, B. (2021, December). How TikTok reads your mind. The New York Times.
Stern, J. (2021, January). Social-media algorithms rule how we see the world. Good luck trying to stop them. The Wall Street Journal.
Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121-136.