It’s time to ditch our YouTube rabbit hole obsession

Our most popular narrative about YouTube is keeping us from truly understanding the platform
Essays
Author

Slater Dixon

Published

January 15, 2023

In March 2018, Zeynep Tufekci wrote what is now a bona fide truism — “YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”

The rabbit hole theory works something like this: to maximize engagement, YouTube recommends progressively extreme political content to users who would not otherwise seek it out. Come for the cat videos, stay for the white supremacy.

Since 2018, this theory has dominated discussions about the platform. But studies have produced mixed results, and most gesture toward a basic conclusion — the idea that YouTube’s algorithm singularly radicalizes its users is overly simplistic, if not flat-out wrong.

For example, Brendan Nyhan and a team of researchers found “limited evidence” in the literature and their own study. They observed that the algorithm reliably fed problematic content to a small minority of users who were already predisposed to seek it out. These “superusers” consumed a vast majority of extreme online content and scored highly on measures of racial resentment. Most recently, NYU researchers “[did] not find evidence that many [users] go down ‘rabbit holes’ that lead them to ideologically extreme content.”

To some extent, it’s not super important whether or not the YouTube algorithm was radicalizing people in 2017. Not only is it hard to prove, but the claim of a linear rabbit hole distorts our understanding of the platform. Despite being the dominant theoretical paradigm used to evaluate YouTube, the theory is deeply flawed, imparting premises and assumptions that preclude any nuanced exploration of the platform’s dynamics.

For example, Becca Lewis has pointed out that the model “bakes in unproductive assumptions” about YouTube, making it hard to explore complicated trends on the platform. She argues, for instance, that the rabbit hole narrative assumes extreme content is easy to define and clearly delineated from mainstream politics. In endorsing the rabbit hole model, Lewis says, “We suggest that there is a stable dark underbelly of YouTube that would remain largely untouched except that the recommendation algorithm draws in users.”

The notion of “far-right radicalization” on YouTube is also prone to fundamental descriptive issues — what does “extreme” even mean? What makes a channel fringe? Our rabbit hole narrative obscures value judgements underlying these assessments. As is evident in other areas of platform governance, one person’s “disinformation” is another person’s crucial political speech.

What would happen if, instead of blaming undesirable political trends on the YouTube algorithm, we studied the platform within a larger political context?

Kevin Munger has observed that anti-establishment sentiment is high in American politics, but even higher on YouTube, where users create an “alternative political canon and community…[that] coalesce around…novel worldviews.” He argues we should study the platform as a way to explore political disillusionment as a phenomenon. The question of why far-right content is so prevalent on YouTube is not, Munger argues, the same as the question of whether the rabbit hole exists. And it’s certainly worth asking, considering the enormous scale of the platform.

In many ways, the rabbit hole is a relic of a time when sweeping theories about what technology was “doing to us” and our politics were in high demand. The “zombie-bite” model still looms over scholarly research surrounding YouTube. Besides being (probably) wrong, this theory slams the door on any analysis that explores the complexities of the platform within a broader sociopolitical context. It would behoove those studying YouTube to ditch the rabbit hole narrative altogether.


This post comes from a presentation I gave in Spring 2022. I’d like to thank my professor Dr. Rocki Wentzel for her help, as well as Kevin Munger and Becca Lewis, who were generous enough to speak with me on the topic.