The Kids Are Alt-Right

Whether you’re into crafts and DIY, boybands, gaming, or grilling, chances are you’ve watched a YouTube video about it. YouTube is a video-sharing platform and the second-largest search engine behind Google Search. Users watch over a billion hours of content on the site every day.

This post from our course blog discusses a growing issue on social media platforms: The Algorithm. Clicks = Ad $$, and algorithms reflect that. The echo chamber (or filter bubble, or whatever you want to call it) that aggressive algorithms create can be dangerous. Once you engage with certain content, similar content starts popping up more often, and users are recommended increasingly extreme content.
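To make that feedback loop concrete, here is a minimal, purely hypothetical sketch of a recommender that follows two rules: only serve content similar to what the user last clicked, and, among those similar items, favor whatever is expected to get the most engagement. This is not YouTube’s actual code; the catalog, the “intensity” scores, and the window size are all invented for illustration.

```python
# Purely illustrative sketch of an engagement-driven feedback loop;
# not any platform's real recommender. Every number here is invented.
# "Intensity" stands in for how provocative an item is, on the assumed
# premise that more provocative content pulls more clicks.

catalog = list(range(101))  # 101 hypothetical items, intensity 0 (mild) to 100 (extreme)

def recommend(last_clicked: int, window: int = 5) -> int:
    """Among items 'similar enough' to the user's last click (within
    `window`), return the one expected to get the most engagement,
    which in this toy model is simply the most intense one."""
    similar = [item for item in catalog if abs(item - last_clicked) <= window]
    return max(similar)

# Simulate a user who just watches whatever is recommended next.
clicked = 10            # starts out on fairly mild content
history = [clicked]
for _ in range(20):
    clicked = recommend(clicked)
    history.append(clicked)

print(history)  # [10, 15, 20, ..., 95, 100, 100, 100] -- a steady ratchet upward
```

Neither rule looks sinister on its own, but together they act as a ratchet: each recommendation is only slightly more extreme than the last, and the drift only ever runs in one direction.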

Safiya Noble’s “Google Search” interrogates the algorithmic practices of biasing information through search engine results, specifically how Black women and girls are rendered online. Noble states an ugly truth: “…search engine technology replicates and instantiates derogatory notions.”

Search results for the word “feminist” in YouTube Search.

TikTokers have recently been posting about this phenomenon on YouTube, known as the “Alt-Right Pipeline,” which particularly affects teenage boys.

PewDiePie, a gaming channel, has been known as an entry point into the alt-right rabbit hole. “Edgy humor” becomes increasingly blurred with hate speech, and compilations of SJWs/feminists/whoever getting destroyed/owned/whatever become all you see. These subcultures are fed by content creators who promote each other and their other social media platforms. In an extreme instance, a shooter live-streamed his attack on a mosque and told viewers, “Remember lads, subscribe to PewDiePie.” In the past four years, alt-right groups have grown emboldened by support from former President Donald Trump.

The rise of the alt-right is both a continuation of a centuries-old dimension of racism in the U.S. and part of an emerging media ecosystem powered by algorithms.

Going through an “Alt-right phase” isn’t quirky or relatable. Interacting with these ideologies has dangerous, real-life consequences.

In their effort to keep users engaged as much as possible, companies have left us with the consequences of algorithms gone wild. They need to be more transparent about their algorithms and actively work to make them anti-racist. Additionally, we need to examine more closely the relationship between entertainment and education online. As we click, and click, and click, companies lead us down extremist rabbit holes, and profit all the while.

Wikipedia Has Changed Over Time and So Have Its Critics

Wikipedia and the concerns over its reliability and accuracy have been discussed on this blog before. It was also a common mantra among my public school teachers to warn about the dangers of Wikipedia because, as they said, anyone can go on there and write anything they want. As my fellow blogger and Pfister showed, however, editing Wikipedia is a much more complicated process than is often claimed. Still, I was curious about how the perception of Wikipedia has changed over time, and the results were rather interesting.

Being an online encyclopedia that is constantly updated and not subject to any single person’s control puts Wikipedia in a unique situation with its critics, because Wikipedia has an entry about itself. And since many of its articles are written by editors who have no financial stake in the website’s reputation, Wikipedia’s own Wikipedia page does not shy away from the common criticisms that have emerged about the site over the years.

The usual suspects are there, such as claims of inaccuracy and unreliability, and the way educators often ban it from being cited in students’ papers. Less commonly heard criticisms are present as well, such as privacy issues surrounding private citizens who have their own articles, and the way Wikipedia allows graphic and explicit content on its pages that could easily be accessed by children. Some of these criticisms could apply to the internet as a whole; Wikipedia, being one of the world’s most famous websites, is simply a more prominent target.

Google and Wikipedia logos in a person's hands

It should be noted that, regardless of what educators and other intellectuals thought of Wikipedia in its early days, it was always popular with the average person. If it wasn’t, it would not have evoked the kind of reaction it did in schools, nor would the site have grown as large as it did. But opinions of Wikipedia among the media have changed since it first went online. It was better received in the 2010s, undercutting the appeal to tradition that some of its critics often relied on.

What surprised me the most when reading about this was how Facebook, Google, and YouTube now link to Wikipedia to help people decipher truth from falsehood. While these sites’ opinions of Wikipedia are hardly the be-all and end-all, and some of the criticisms listed above might still be valid, it is ironic to see the site that was once denounced as unreliable now held up as a standard of credibility. Maybe new forms of writing simply take time to be accepted.

Playing the Algorithm: How It Can Backfire

On YouTube, there is one universal rule for all creators: eventually, you’ll have to make an apology video to address a controversy you find yourself embroiled in. These scandals can range from mostly harmless missteps to actual criminal offenses. A very recent example is David Dobrik, who released two separate apology videos addressing a sexual assault incident he both facilitated and filmed, along with a variety of other allegations.

Like many creators, Dobrik knew that addressing the controversy could lead to more fans finding out about it. Therefore, he released his first apology, titled “Let’s Talk,” on his least-followed channel, with likes and dislikes disabled. More than likely, he was hoping this would be enough of a response to satisfy fans asking him to address the allegations, without the video being seen by most people.

David Dobrik looking stupid

A screenshot from one of Dobrik’s apology videos. Notice that he left in footage of himself crying and that he is sitting on the floor, both common tropes in the YouTube apology genre.

Unfortunately for him, the video ended up on the YouTube trending page, and his transparent attempt to manipulate the algorithm led to even more people speaking out about the issue. He eventually had to make a second apology, which was still met with some controversy. Dobrik has now lost his sponsorships and had to step away from his app, Dispo.

As Timothy Laquintano and Annette Vee discussed, automated systems greatly affect our writing and communication, and I believe Dobrik’s case is a prime example. Dobrik has thrived off the algorithm. He has even sold merchandise with “clickbait” printed on it, showing how deliberately he works to manipulate YouTube’s automated systems. His name or face attached to a project automatically makes the algorithm treat it more favorably, and he has famously declined to respond to past scandals to avoid negative associations with his name.

a red hoodie with the word "clickbait" on it

This has clearly led to his downfall, however. By playing the algorithm to boost his name recognition, Dobrik has made it even easier for others to call him out. He cannot hide his apology, not even with the help of the automated systems he knows so well. In my opinion, he is finally getting what he deserves.