The 2020s are a strange period to live through. Shifts in digital technologies from the past few years — namely, the ever-increasing presence of recommendation algorithms in every aspect of our digital lives — are shaking up our information environment in ways that can be hard to comprehend. I am wary of following the path of least resistance when adapting to this new reality. I don't want to come off as a complete curmudgeon or doomsday-prepper here; I think these developments have the potential to improve our lives dramatically down the line, assuming we mold them and adapt them to do so. But for right now, big tech companies are pushing the boundaries of profit-motivated black-box recommendation algorithms, and it looks like it's going to be a while before anyone figures out a way to effectively stop them. We need to take our digital wellbeing into our own hands.
The structure of algorithmic social media is designed to expose its users to inflammatory content, reduce user agency over the content and creators they engage with, and monopolize user attention via purposefully addictive Skinner box mechanisms. We should examine our social media use as we would any other harmful, addictive substance, and take steps to reduce our dependency on the algorithmic internet.
Let's break each of these points down a bit.
- Exposing users to inflammatory content: Recommendation algorithms are the sets of instructions that tell a platform like TikTok, X, or YouTube which content to put onto a given user's feed. Most of these algorithms operate as "black boxes" — we know what gets fed into the algorithm (your viewing history, the amount of time you spend in-app after seeing various types of posts, your demographic data, etc.), and we know what the algorithm is trying to maximize (generally time spent in-app, i.e., the amount of time you could potentially be exposed to advertisements), but how the algorithm gets from A to B is often unknown even to the people who designed it. Regardless, it's clear to anyone who has used algorithmic social media that recommendation algorithms have figured out that inflammatory, hateful, and outrageous content leads to higher rates of user interaction and retention. If you've ever wondered why scrolling through your social media feeds makes you feel miserable, hopeless, or angry, bear this in mind: recommendation algorithms do not exist to keep you happy; they exist to keep you scrolling.
- Reducing user agency: Recommendation algorithms encourage users to consume content that is selected for them rather than making their own choices about which content and creators to engage with. Social media platforms are designed so that the path of least resistance is to simply scroll through an endless stream of algorithm-recommended content. Curating your own feeds takes time and effort, and depending on the platform, may be effectively impossible. From the point of view of social media companies, why put the power to choose what you view into the user's hands? When it comes to maximizing advertising revenue, the algorithm always knows best.
- Social media Skinner boxes: For this point I defer to HGModernism's video on social media and Skinner boxes. The SparkNotes version: we don't just derive pleasure from the content we consume online; we also derive pleasure from the process of finding that content in a sea of algorithmically generated recommendations. Social media companies know this, of course, and leverage it with innovations like the "endless scroll," which keeps users in a constant state of anticipation, like a gambler waiting for the next big win. If you've ever wondered why it's so easy to spend hours scrolling on your phone only to realize at the end that you weren't even really enjoying it... this is why.
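To make the first point concrete, here's a toy sketch of an engagement-maximizing recommender. Everything in it is invented for illustration: the topic-matching heuristic, the post fields, the function names. Real platform algorithms are vastly more complex and not public. The point is simply to show what the objective function includes, and what it leaves out.

```python
# Toy sketch of an engagement-maximizing recommender.
# All names, fields, and scoring logic here are made up for
# illustration; real recommendation systems are not public.

def predicted_watch_time(user_history, post):
    """Naive score: posts similar to what kept this user watching
    before are predicted to keep them watching again."""
    score = 0.0
    for past_post, seconds_watched in user_history:
        if past_post["topic"] == post["topic"]:
            score += seconds_watched
    return score

def recommend(user_history, candidate_posts, k=3):
    """Rank candidates purely by predicted time-in-app.
    Note what's absent: no term for wellbeing, accuracy, or mood."""
    return sorted(
        candidate_posts,
        key=lambda p: predicted_watch_time(user_history, p),
        reverse=True,
    )[:k]

# This user once watched 120 seconds of outrage and 15 seconds of cats.
history = [({"topic": "outrage"}, 120), ({"topic": "cats"}, 15)]
posts = [
    {"id": 1, "topic": "outrage"},
    {"id": 2, "topic": "cats"},
    {"id": 3, "topic": "news"},
]
feed = recommend(history, posts)
```

Nothing in `recommend` asks whether the user will feel good about the result; the only signal is predicted time-in-app, so the inflammatory post ends up at the top of the feed.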
So, what do we do about this?
Well, obviously there's a lot that needs to be done on a social, political, and legal level. But as far as things that we can do as individuals to maintain our own digital wellbeing, I have a couple suggestions:
- Get off that damn phone. If you grew up around the same time I did, you probably remember all the handwringing in the media about "kids these days spending all their time on their phones." Unfortunately, there's a kernel of truth to this sentiment. Recommendation algorithms prey on negative emotions (see point 1 above), so the more time we spend with our focus elsewhere, the better. Reducing phone usage can take a lot of different forms — check out the video I Dumb-i-fied my iPhone and Got My Life Back from Digging the Greats on YouTube for a more extreme example — but if you're just looking for an easy way to get started, try turning your phone screen to black and white. There's usually an option to do this in your accessibility settings, and it's a surprisingly strong deterrent to endless scrolling.
- Cultivate less-algorithmic media feeds. Remember how people used to pay to subscribe to newspapers and magazines? You can still do that! And if you're looking for something cheaper, try putting together an RSS feed that automatically pulls posts from various publications and websites. Inoreader is what I use, but any decent RSS reader will do (unnecessary web-3.0 / AI features aside).
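As a small illustration of why RSS sidesteps the recommendation machinery entirely, here's roughly what a feed reader does under the hood, sketched with only Python's standard library. The feed XML is inlined and made up for the example; a real reader would fetch each feed over HTTP on a schedule.

```python
# Minimal sketch of RSS parsing using only the standard library.
# The sample feed below is invented for illustration.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item>
      <title>First Post</title>
      <link>https://example.com/first</link>
    </item>
    <item>
      <title>Second Post</title>
      <link>https://example.com/second</link>
    </item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return (title, link) pairs for each item, in the order the
    publisher listed them. No ranking algorithm in sight."""
    root = ET.fromstring(xml_text)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

for title, link in parse_feed(SAMPLE_FEED):
    print(title, "->", link)
```

The items come out in exactly the order the publisher chose, with no engagement model deciding what you see or in what sequence.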
Anyways, that's all I got for now. Stay well, and remember: your attention is valuable. Be careful about who you give it to.