YouTube is one of the largest, if not the largest, information-sharing platforms in the world. While not always thought of as “social media,” it enables people from all around the world to share ideas and experiences and connect with one another, albeit through uploaded video content instead of messages or images. Over 30,000 hours of content are uploaded to YouTube every hour, an impossible amount for any one person to sift through in a lifetime, so YouTube surfaces videos for viewers through a recommendation algorithm. Over 70% of the content consumed on the platform is supplied by this system, which is designed to suggest videos to users based on a variety of factors, including their viewing history, search habits, and engagement metrics like likes, dislikes, and comments.
Historically, the YouTube recommendation algorithm has also been a topic of controversy, particularly regarding political radicalization. Studies found that the algorithm could inadvertently promote extremist content by leading users down a “rabbit hole” to the alt-right through increasingly radical far-right videos (Ribeiro et al., 2020; Bryant, 2020). This likely happened because the algorithm often prioritized highly engaging content, which can include sensationalist or polarizing material.
In more recent studies, however, the findings suggest that YouTube has steered its recommendation algorithm away from this “radicalization pipeline,” adopting a centrist/left-leaning bias, even going so far as to make it almost impossible to find far-right content through recommendations when starting from an account that YouTube has tagged as “centrist” (Ibrahim et al., 2023). Still, this remains a topic of debate, and other recently published studies find the opposite: that YouTube still seems to have a radicalizing effect (Haroon et al., 2023).
These different observations may be due to a difference in methodology: “These conflicting conclusions are due to subtle but crucial differences in the methodologies. Some work relies on active measurements using untrained sock puppets (i.e., without any watch history) and thus cannot capture recommendation processes among actual users.” (Haroon et al., 2023). If this is true, it raises the question of whether YouTube shows a left-wing bias when Incognito Mode is active and a right-wing bias when real accounts are used, or whether the differing findings are instead due to changes to YouTube’s algorithm, which, like many things in the AI and tech realm, is updated and reworked constantly.
Research Questions: What is the experience today of a YouTube user coming into contact with political content? Is it easy to find from unrelated videos, or do specific “clusters” of video topics need to be traversed before recommendations lead you toward a more radical position? Is there a difference in this experience between left- and right-wing videos? Does it make any difference whether you use an actual account or an incognito session?
Blue Sky Audits
(Two Blue Sky Audits: one almost impossible, one more realistic but still time-consuming and requiring coding skills I don’t have.)
Bluer Sky Audit: (unrelated to the proof-of-concept audit, but it would be really interesting) Using a browser extension, track the videos that 1,000-10,000 people (of varying political leanings and demographics) watch, as well as the recommendations they encounter along the way. Record what portion of videos are clicked on from the top 3, top 5, top 10, top 20, etc. video recommendations.
From this data, note what percentage of the videos in those respective categories are politically similar to the current video and what percentage are different. Also note, when videos are different, what the commonalities are among the “different” videos between people of similar political views: are there “pre-rabbit-hole” clusters? What are the common themes of videos watched by left- or right-wing people that are not necessarily political content? Are there videos that are suggested to a certain group but never watched? (This part relates to observations I had when conducting my practical experiment, where, during periods of exploring the far-right-biased videos, I saw tons of centrist/moderately right-leaning news channels.) A sketch of how the logged data might be analyzed is given below.
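The sketch below shows, under many assumptions, how the extension’s logs could be summarized. Every field and function name here (WatchEvent, rec_rank, lean, rank_share, same_lean_share) is a placeholder I made up for illustration; the political-lean labels would have to come from hand coding or a classifier that this audit would still need to choose.

```python
# Minimal analysis sketch for hypothetical browser-extension logs.
# Each WatchEvent records one watched video, the rank it held in the
# previous video's sidebar (None if reached via search), and assumed lean labels.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WatchEvent:
    user_id: str
    video_id: str
    rec_rank: Optional[int]   # position in the previous video's recommendations
    lean: str                 # e.g. "left", "center", "right" (hand-coded or classified)
    prev_lean: Optional[str]  # lean of the video watched immediately before

def rank_share(events: list[WatchEvent], top_k: int) -> float:
    """Fraction of recommendation-driven clicks that came from the top_k slots."""
    rec_clicks = [e for e in events if e.rec_rank is not None]
    if not rec_clicks:
        return 0.0
    return sum(e.rec_rank <= top_k for e in rec_clicks) / len(rec_clicks)

def same_lean_share(events: list[WatchEvent]) -> float:
    """Fraction of hops where the next video shares the previous video's lean."""
    hops = [e for e in events if e.prev_lean is not None]
    if not hops:
        return 0.0
    return sum(e.lean == e.prev_lean for e in hops) / len(hops)

if __name__ == "__main__":
    sample = [
        WatchEvent("u1", "v1", None, "center", None),
        WatchEvent("u1", "v2", 2, "center", "center"),
        WatchEvent("u1", "v3", 5, "right", "center"),
    ]
    for k in (3, 5, 10, 20):
        print(f"top-{k} share: {rank_share(sample, k):.2f}")
    print(f"same-lean share: {same_lean_share(sample):.2f}")
```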
Blue Sky Audit: (expanded version of the proof-of-concept audit) Using the strategies outlined in both Haroon’s and Ibrahim’s studies, use sock-puppet accounts, incognito-mode sessions, and a handful of hand-trained accounts to re-run both of their experiments simultaneously, as well as an expanded version of the proof-of-concept audit, to see whether their findings hold up in the current state of YouTube’s algorithm, and in what scenarios YouTube biases recommendations toward left-leaning or right-leaning videos.
The two studies have significantly different methodologies and measurements (Ibrahim’s is about being able to “escape” or “get into” different political classifications on YouTube, and how easy or difficult it is to progress from being marked as a “centrist” user to being marked as a “far-right” user, while Haroon’s is about how the recommended videos themselves align with that identifier). I’m especially interested in replicating Haroon’s, since it is quite similar to my practical experiment (although I used an incognito session), yet I found results matching Ibrahim’s, with the left-leaning bias being stronger.
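As a very rough sketch of what the expanded replication could look like operationally, the snippet below simply enumerates the experiment conditions (account type × seed leaning × protocol), assuming three repeats per condition. The condition labels are my own placeholders, not the exact categories used by Haroon or Ibrahim.

```python
# Enumerate a run plan for the combined replication; labels are illustrative only.
from itertools import product

ACCOUNT_TYPES = ["sock_puppet_trained", "incognito", "hand_trained"]
SEED_LEANINGS = ["far_left", "left", "center", "right", "far_right"]
PROTOCOLS = ["haroon_recommendation_audit", "ibrahim_classification_audit", "proof_of_concept_walk"]

def build_run_plan(repeats_per_condition: int = 3) -> list[dict]:
    """List every (account type, seed leaning, protocol) condition to be run."""
    plan = []
    for account, seed, protocol in product(ACCOUNT_TYPES, SEED_LEANINGS, PROTOCOLS):
        for rep in range(repeats_per_condition):
            plan.append({"account": account, "seed": seed, "protocol": protocol, "rep": rep})
    return plan

if __name__ == "__main__":
    plan = build_run_plan()
    print(f"{len(plan)} runs scheduled")  # 3 account types * 5 seeds * 3 protocols * 3 repeats = 135
```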
Proof-of-Concept Audit
Summary: Starting from a neutral prompt like “Hiking,” how many video recommendations are needed to reach a far-right or far-left video topic, and what is the experience like? What are the connected nodes that draw the viewer into the far-right or far-left rabbit hole?
Using an incognito session, I loaded the video “How the Appalachian Trail Ruined My Life | Reality after a Thru Hike” and hopped to the most right-leaning video I could find within the top 5 recommendations in the YouTube sidebar. For the most part, these were videos within the same political-disposition cluster as the current video, but occasionally something more radical appeared. I repeated this step over and over until I reached a video in the “Alt-Right” category (as defined by the study Auditing Radicalization Pathways on YouTube (Ribeiro et al., 2020)), recording every video I went through, as well as the general “themes” I encountered as I journeyed toward radicalization. I then repeated this process with the same starting video, but with the goal of finding the most left-leaning video in the top 5 recommendations. A rough sketch of this hop procedure as a loop is included below.
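The loop below is a minimal sketch of that hop procedure, with hypothetical helpers standing in for the two steps I actually did by hand (reading the top 5 sidebar recommendations and judging a video’s political lean). It is not how the experiment was automated, since it wasn’t automated at all.

```python
# Sketch of the manual hop procedure; the two stubs were done by hand in the real audit.
def get_top_recommendations(video_id: str, n: int = 5) -> list[str]:
    """Stub: in practice this was read manually from the sidebar; automating it
    would require a headless browser, which is out of scope for this sketch."""
    raise NotImplementedError

def lean_score(video_id: str) -> float:
    """Stub: a manual judgment in the real experiment; higher = further right."""
    raise NotImplementedError

def walk_toward_extreme(start_video: str, target_cluster_ids: set[str],
                        direction: int = +1, max_hops: int = 50) -> list[str]:
    """Repeatedly hop to the most right-leaning (direction=+1) or most
    left-leaning (direction=-1) of the top 5 recommendations, recording the path,
    until a video in the target cluster is reached or max_hops is exhausted."""
    path = [start_video]
    current = start_video
    for _ in range(max_hops):
        if current in target_cluster_ids:
            break
        candidates = get_top_recommendations(current, n=5)
        current = max(candidates, key=lambda v: direction * lean_score(v))
        path.append(current)
    return path
```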
After recording those results, I tried another prompt to look for other connections and related clusters leading to the more radical categories. Starting with “Investment,” I used the video “🔥 INVESTMENT STRATEGY – EXCELLENT PROFIT | Binary Options Investment | Trading Investments” as my starting point for both the right and the left, curious whether it would be easier to find right-leaning or far-right videos from this more center-right video topic, compared to the “Hiking” prompt’s slightly more center-left political disposition.
(Figure: a selection of videos from the first right-leaning experiment.)
Findings: With the first experiment, I hopped through the following content clusters:
Hiking (start video, 1 video in cluster) → Hiking Horror (3 videos) → True Crime (3 videos) → Police Drama (4 videos) → Blue Lives Matter Content (2 videos) → Anti-Woke/IDW (7 videos) → Alt-Lite/IDW (6 videos). I considered expanding the experiment to go further and try to find any of the Alt-Right content as defined by Ribeiro’s study, but after looking a bit deeper into those channels, every single one had either been taken down by YouTube or its owner had been arrested, so I could not have found any of those channels even if I had kept this experiment going indefinitely.
What I noticed immediately about the experience was how much I was bombarded with center and center-right news videos as soon as I hit the Blue Lives Matter content, a pattern that persisted all the way to the end. For some videos, my top 4 recommendations were Forbes or CNN news stories, and only the 5th was an anti-woke or alt-lite video similar to the one I was watching.
When repeating the “Hiking” experiment with a left-wing bias, I hopped through the following content clusters: Hiking (start video, 1 video in cluster) → Travel (4 videos) → Social Critique (2 videos) → “Breadtube” (5 videos) → Antifascism/Socialist Content (5 videos).
It was much, much quicker to get to the left-wing videos (around 5-9 fewer videos, depending on which depths of left- and right-wing content are compared), or at least it seemed that way, and unlike my experience going toward the far right, most of the recommended videos weren’t news but other related creators in the same political bracket. An interesting observation was that it took about the same number of videos to reach the “Breadtube” cluster (arguably the limit of YouTube’s far left, considering those creators also tend to make antifascist/socialist videos) as it did to reach the “anti-woke” videos. One possible reading is that the YouTube algorithm doesn’t actually favor either side when viewed in an incognito window; instead, there simply isn’t a “far-left” pipeline the way there is a “far-right” pipeline. That is, the far right is much more radical and farther from the center than the far left, and the additional videos required to reach it reflect the bigger jump along the political axis from the center to the far-right extreme.
I won’t go into as much detail about the experiments with “Investment” as a starting point, but it didn’t seem to make far-right videos any easier to find than far-left ones. If anything, it widened the gap, though that may be because I was unlucky with the recommendations I was given. I did pass through different routes to each extreme, though: the right leaned more toward stoicism and “manosphere” content, and the left toward science and AI. During the “manosphere” section, instead of a flood of Forbes and CNN news, the recommendations contained a lot of documentaries (the kind you purchase to view on YouTube), and this trend continued until a couple of videos into the Anti-Woke/IDW section, after which it returned to news stories.
Unfortunately, I was unable to continue research with this method: after moving to Thailand, whenever I watched videos in Incognito mode, my results were skewed toward a Thai audience. The top 5 recommendations for most videos were filled primarily with Thai-language videos that had no parallels to the political content I had been experimenting with before arriving here.
My biggest surprise was how different the actual state of the YouTube algorithm was from my preconceptions (and the preconceptions of the people I talked to). I think a lot of people are still stuck thinking that the YouTube algorithm has not changed since the Gamergate controversy or the boom of the alt-lite scene on YouTube. It’s remarkable how different articles from 2020 and 2023 are when writing about the recommendation algorithm, and I think it shows how much work YouTube has put into removing the more harmful aspects of far-right radicalization from the platform.
Compared to other big social media algorithms, YouTube seems to be pulling ahead of Twitter and Facebook, and especially platforms like TikTok, in terms of Paul Barrett’s recommendations to social media platforms (Barrett, 2021). But it’s certainly still not enough: deplatforming alt-right channels and tuning the recommendation algorithm away from far-right content is good, but without transparency about how the algorithm is doing these things, it may raise concerns about active censorship. “Disclosing what they’re doing, how they’re doing it, and what content might potentially get blocked in the process is the only way the platforms can counter suspicions that such measures are designed to manipulate politics or otherwise exert illegitimate influence.” (Barrett, 2021).
In conclusion, I feel weirdly positive about the current state of YouTube’s algorithm. It fails less egregiously than most other similar platforms, puts an active effort into countering misinformation via the additional resources on vaccines, climate change, the Holocaust, etc. underneath videos, and discourages right-wing radicalization by slowing the progression toward the extremes while filling the recommendations with more centrist news. It’s still not great, and there is always the possibility that changes to the algorithm will reverse these efforts, but compared to platforms like Facebook or Twitter, whose CEOs are actively using them to spread misinformation and radicalize their user bases, YouTube has a pretty good stance on things, all things considered.
Works Cited
Ribeiro, M. H., Ottoni, R., West, R., Almeida, V. A. F., & Meira, W. (2020). Auditing radicalization pathways on YouTube. Retrieved from arXiv.
Haroon, M., Wojcieszak, M., Chhabra, A., Liu, X., Mohapatra, P., & Shafiq, Z. (2023). Auditing YouTube’s recommendation system for ideologically congenial, extreme, and problematic recommendations. PNAS.
Bryant, L. (2020). The YouTube algorithm and the alt-right filter bubble. Open Information Science, 4(1), 85-90. https://doi.org/10.1515/opis-2020-0007
Ibrahim, H., Al Dahoul, N., Lee, S., Rahwan, T., & Zaki, Y. (2023). YouTube’s recommendation algorithm is left-leaning in the United States. PNAS Nexus, 2(8), pgad264. https://doi.org/10.1093/pnasnexus/pgad264
Abul-Fottouh, D., Song, M. Y., & Gruzd, A. (2020). Examining algorithmic biases in YouTube’s recommendations of vaccine videos. International Journal of Medical Informatics.
Barrett, P. M., Hendrix, J., & Sims, J. G. (2021). Fueling the Fire: How Social Media Intensifies Political Polarization. NYU Stern Center for Business and Human Rights.