How “The Algorithm” Builds Toxic Mental Health Echo Chambers

CW: mental health, suicide, eating disorders

If you’re anything like me, you have something of a love-hate relationship with “the algorithm”. On the one hand, I’m regularly shown content I’m partial to. I’m into houseplants and calligraphy, and the algorithm knows that, so I rather like coming across aesthetically pleasing calligraphy videos on YouTube. On the other hand, I’m a little creeped out that the algorithm knows me so well, and I know that it can serve to perpetuate harmful ideas (as discussed in Noble’s article). On a sillier level, I don’t exactly appreciate getting called out on the regular by other young adults with mental health issues on the internet.

Actually, interacting (that is, liking or commenting) with that last type of video can easily trigger another aspect of “the algorithm” that I’m less enthused about: the funneling of impressionable young people into misguided mental health spaces. These are online spaces (comment sections, users’ personal pages, group accounts) wherein often unqualified young adults and teens discuss mental health. Users will make videos prompting others to relate to symptoms of neurodevelopmental disorders or mental illnesses, poke fun at their own mental health challenges, and sometimes glamorize the idea of being deeply unhappy, even suicidal.

Right now, this subculture is having a bit of a moment on TikTok, but it’s certainly not anything new. I’m sure my classmates remember 2012-era Eating Disorder Tumblr.

A mild example of toxic eating disorder culture on Tumblr: what a search of "thinspo" yields.

I’m not trying to say that this side of the internet is all bad, though. Users often share tips and tricks that make daily tasks easier to accomplish, or encourage people to seek professional help if they are struggling. Other users are actual medical professionals or therapists doing their best to offer useful advice. It’s also just nice to know that you’re not alone in your problems. I’ve found solace in knowing I’m not the only one experiencing feelings I thought were uniquely mine to bear, or that I’m not the only one who worries about [insert silly thing].

All I’m trying to say is that, when “the algorithm” aggressively directs users to these kinds of mental health spaces and subsequently feeds them often misguided and toxic information, things can quickly get ugly. Vulnerable young people have been known to develop eating disorders or pick up inadvisable coping mechanisms as a result of participating in such online spaces. And because they continue to interact with such content, these young people can find it extremely difficult to break out of these toxic bubbles. Instead, they get stuck in a nightmarish echo chamber full of other sad teens who are just trying to feel okay in a confusing, scary world.

It’s this echo chamber effect created by “the algorithm” that worries me most. It certainly isn’t limited to mental health discourse: social and political echo chambers exist all over the internet. Laquintano and Vee describe how “the algorithm” affected the circulation of political information ahead of the 2016 election in their article. These spaces can similarly serve to promote misguided ideologies (such as glorifying cults).


A political cartoon showing a modern example of how social media creates echo chambers. Illustration by Robert Ariall.

Generally though, echo chambers of any kind do one thing best: they echo. They repeat the same few ideas and opinions over and over and over again. And when those ideas are harmful, bad things happen. Real-world problems start to occur, and perhaps just as importantly, young people who’ve fallen prey to this algorithmic shepherding are prevented from seeing that there are other parts of life, online and off, that are better than this. Even beautiful. This isn’t all there is. Some things matter way more than the circumference of your wrists.

I don’t have a solution to this shepherding problem. Do we need more content censorship so that harmful information never ends up online in the first place? Or is that an infringement upon free speech? Should we “dial back” how aggressively the algorithm picks up on browsing patterns and herds us into groups? I don’t know. But I’m confident that we could all benefit from stepping outside our online bubbles, even if we don’t think we’re in a harmful or hateful space. Perspective is key: your slice of the internet is never all there is. The internet can be a tool for good, if we use it that way.
