{"id":33,"date":"2024-02-23T06:01:19","date_gmt":"2024-02-23T06:01:19","guid":{"rendered":"https:\/\/responsiblepraxis.ai\/?page_id=33"},"modified":"2024-04-13T03:05:41","modified_gmt":"2024-04-13T03:05:41","slug":"ty-greacen-youtube-recommendation-engine","status":"publish","type":"page","link":"https:\/\/responsiblepraxis.ai\/?page_id=33","title":{"rendered":"Ty Greacen &#8211; YouTube Recommendation Engine"},"content":{"rendered":"\n<p>YouTube is one of, if not the largest information sharing platforms in the world. While not necessarily commonly considered a \u201cSocial Media\u201d, it enables people from all around the world to share ideas and experiences and connect to one another, albeit through uploaded video content instead of messages or images. Over 30,000 hours of content are uploaded to YouTube every hour; an impossible amount for any one person to sift through in their lifetime, so YouTube provides viewers with videos based upon a recommendation algorithm. Over 70% of the content consumed on the platform is supplied through this system designed to suggest videos to users based on a variety of factors, including their viewing history, search habits, and engagement metrics like likes, dislikes, and comments.&nbsp;<\/p>\n\n\n\n<p>Historically, the YouTube recommendation algorithm has also been a topic of controversy, particularly regarding political radicalization. 
Studies found that the algorithm could inadvertently promote extremist content by leading users down a &#8220;rabbit hole&#8221; to the alt-right through increasingly radical far-right videos (Ribeiro et al., 2020; Bryant, 2020). This likely happened because the algorithm prioritized highly engaging content, which can include sensationalist or polarizing material.<\/p>\n\n\n\n<p>More recent studies, however, suggest that YouTube has steered its recommendation algorithm away from this \u201cradicalization pipeline\u201d, adopting a centrist\/left-leaning bias, even going so far as to make it almost impossible to reach far-right content through recommendations when starting from an account YouTube tagged as \u201ccentrist\u201d (Ibrahim et al., 2023). Still, this remains a topic of debate, and other recently published studies find the opposite: that YouTube still seems to have a radicalizing effect (Haroon et al., 2023).&nbsp;<\/p>\n\n\n\n<p>These divergent observations may come down to differences in methodology: <em>\u201cThese conflicting conclusions are due to subtle but crucial differences in the methodologies. Some work relies on active measurements using untrained sock puppets (i.e., without any watch history) and thus cannot capture recommendation processes among actual users.\u201d<\/em>&nbsp;(Haroon et al., 2023). If this is true, it raises the question of whether YouTube has a left-wing bias when Incognito mode is active and a right-wing bias when real accounts are used, or whether the differing findings instead reflect changes to YouTube\u2019s algorithm, which, like many things in AI and tech, is updated and reworked constantly.<\/p>\n\n\n\n<p>Research Questions: What is the experience today of a YouTube user coming into contact with political content? 
Is political content easy to find from unrelated videos, or must specific \u201cclusters\u201d of video topics be traversed to lead you towards a more radical position? Is there a difference in this experience between left- and right-wing videos? Does using an actual account versus an incognito session make any difference?<\/p>\n\n\n\n<p>Blue Sky Audits&nbsp;<br>(Two Blue Sky Audits: one almost impossible, one more realistic but still time-consuming and requiring coding skills I don\u2019t have.)<\/p>\n\n\n\n<p>Bluer Sky Audit: (unrelated to the proof-of-concept audit, but it would be really interesting) Using a browser extension, track the videos that 1,000\u201310,000 people (of varying political leanings and demographics) watch, as well as the recommendations they encounter along the way. Record what portion of videos are clicked on from the top 3, top 5, top 10, top 20, etc. recommendations.<\/p>\n\n\n\n<p>From this data, note what percentage of the videos in those respective categories are politically similar to the current video, and what percentage are different. Also note, when videos do differ, what commonalities exist among the \u201cdifferent\u201d videos across people of similar political views \u2013 are there \u201cpre-rabbithole\u201d clusters? What are the common themes of videos watched by left- or right-wing people that are not necessarily political content? Are there videos that are suggested to a certain group but never watched? 
(This relates to observations from my practical experiment, where, during stretches of exploring far-right-leaning videos, I saw large numbers of centrist and moderate right-leaning news channels.)<\/p>\n\n\n\n<p>Blue Sky Audit: (expanded version of the proof-of-concept audit) Using the strategies outlined in both Haroon\u2019s and Ibrahim\u2019s studies, use sock-puppet accounts, incognito-mode sessions, and a handful of hand-trained accounts to re-run both of their experiments simultaneously, as well as an expanded version of the proof-of-concept audit, to see whether their findings hold up in the current state of YouTube\u2019s algorithm, and in what scenarios YouTube biases towards left-leaning or right-leaning videos.&nbsp;<\/p>\n\n\n\n<p>The two studies have significantly different methodologies and outcome measures (Ibrahim\u2019s is about being able to \u201cescape\u201d or \u201cget into\u201d different political classifications on YouTube, and how easy or difficult it is to progress from being marked as a \u201ccentrist\u201d user to a \u201cfar-right\u201d user; Haroon\u2019s is about how the video recommendations themselves match this identifier). 
I\u2019m especially interested in replicating Haroon\u2019s, since it seems quite similar to my practical experiment (although I used an incognito session), yet I found results matching Ibrahim\u2019s, with the left-leaning bias being stronger.<\/p>\n\n\n\n<p>Proof-of-Concept Audit<\/p>\n\n\n\n<p>Summary: Starting from a neutral prompt like \u201cHiking\u201d, how many video recommendations are needed to progress to a far-right or far-left video topic, and what is the experience like \u2013 what connected nodes draw the viewer into the far-right or far-left rabbit hole?<\/p>\n\n\n\n<p>Using an Incognito session, I loaded up the video&nbsp;<em>How the Appalachian Trail Ruined My Life | Reality after a Thru Hike<\/em>&nbsp;and hopped to the most right-leaning video I could find within the top 5 recommendations in the YouTube sidebar. For the most part, these were videos within the same political disposition cluster as the current video, but occasionally something more radical appeared \u2013 I\u2019d repeat this step over and over until I reached a video in the \u201cAlt-Right\u201d category (as defined by the study&nbsp;<em>Auditing Radicalization Pathways on YouTube<\/em>)&nbsp;(Ribeiro et al., 2020), recording every video I went through, as well as the general \u201cthemes\u201d I encountered as I journeyed towards radicalization. I\u2019d then repeat this process with the same starting video, but with the goal of finding the most \u201cleft-leaning\u201d video in the top 5 recommendations.<\/p>\n\n\n\n<p>After recording those results, I tried another prompt to see other connections and clusters related to the more radical categories. 
Starting with \u201cInvestment\u201d, I used the video&nbsp;<em>\ud83d\udd25&nbsp;INVESTMENT STRATEGY &#8211; EXCELLENT PROFIT | Binary Options Investment | Trading Investments<\/em>&nbsp;as my starting point for both the right and the left, curious whether it would be easier to find right-leaning or far-right videos from this more center-right topic, compared to the \u201cHiking\u201d prompt\u2019s slightly more center-left political disposition.<\/p>\n\n\n\n<p>Figure: A selection of videos from the first right-leaning experiment<\/p>\n\n\n\n<p>Findings: With the first experiment, I hopped through the following content clusters:<br>Hiking (start video, 1 video in cluster), Hiking Horror (3 videos), True Crime (3 videos), Police Drama (4 videos), Blue Lives Matter Content (2 videos), Anti-Woke\/IDW (7 videos), Alt-lite\/IDW (6 videos). I considered expanding the experiment further to try to find any of the Alt-Right content as defined by Ribeiro\u2019s study, but after looking a bit deeper into those channels, every single one had either been taken down by YouTube or had its owner arrested, so I could not have found any of those channels even if I had kept this experiment going indefinitely.<\/p>\n\n\n\n<p>What I noticed immediately about the experience was how much I was bombarded with center\/center-right news content as soon as I hit the Blue Lives Matter cluster, a pattern that persisted all the way to the end. 
For some videos, my top 4 recommendations were Forbes or CNN news stories, and only the 5th was an anti-woke or Alt-lite video similar to the one I was watching.<\/p>\n\n\n\n<p>When repeating the \u201cHiking\u201d experiment with a left-wing bias, I hopped through the following content clusters: Hiking (start video, 1 video in cluster), Travel (4 videos), Social Critique (2 videos), \u201cBreadtube\u201d (5 videos), Antifascism\/Socialist Content (5 videos).<\/p>\n\n\n\n<p>It was much, much quicker to get to the left-wing videos (around 5\u20139 fewer videos, depending on how the depth of left- versus right-wing content is compared), or at least it seemed that way, and unlike my experience heading towards the far-right, most of the recommended videos weren\u2019t news but other related creators in the same political bracket. An interesting observation was that it took about the same number of hops to reach the \u201cBreadtube\u201d videos (which can be argued to be the limit of the far-left on YouTube, considering those creators also tend to make antifascist\/socialist videos) as it did to reach the \u201canti-woke\u201d videos. A possible reading is that the YouTube algorithm doesn\u2019t actually favor either side when viewed in an incognito window; instead, there simply isn\u2019t a \u201cfar-left\u201d pipeline the way there is a \u201cfar-right\u201d pipeline. That is, the far-right is much more radical and farther from centrist than the far-left is, and the larger number of videos required to reach it reflects the bigger jump on the political axis from the center to the far-right extreme.<\/p>\n\n\n\n<p>I won\u2019t go into as much detail about the experiments with \u201cInvestment\u201d as a starting point, but it didn\u2019t seem to make far-right videos any easier to find than far-left ones. 
If anything, it made the gap wider, though that may be because I was unlucky with the recommendations given to me. I did pass through some different routes to each extreme: the right leaned more towards stoicism and \u201cmanosphere\u201d content, the left towards science and AI. During the \u201cmanosphere\u201d section, instead of a ton of Forbes and CNN news, there were a lot of documentaries (the kind you purchase to view on YouTube) in the recommendations, and this trend continued until a couple of videos into the Anti-Woke\/IDW section, after which it returned to news stories.<\/p>\n\n\n\n<p>Unfortunately, I was unable to continue researching with this method: after I moved to Thailand, whenever I watched videos in Incognito mode, all of my results were skewed towards a Thai audience. My top 5 recommendations for most videos were filled primarily with Thai-language videos that had no parallels to the political content I had been experimenting with before arriving.<\/p>\n\n\n\n<p>My biggest surprise was how different the actual state of the YouTube algorithm was from my preconceptions (and those of the people I talked to). I think a lot of people are still stuck thinking that the YouTube algorithm has not changed since the Gamergate controversy or the boom of the Alt-lite scene on YouTube. It\u2019s remarkable how differently articles from 2020 and 2023 write about the recommendation algorithm, and I think it shows how much work YouTube has put into removing the more harmful aspects of far-right radicalization from the platform.<\/p>\n\n\n\n<p>Compared to other big social media algorithms, YouTube seems to be pulling ahead of Twitter or Facebook, and especially of platforms like TikTok, in terms of Paul Barrett\u2019s recommendations to social media platforms. 
(Barrett et al., 2021). But it\u2019s certainly still not enough \u2013 deplatforming alt-right channels and tuning the recommendation algorithm away from far-right content is good, but without transparency about&nbsp;<em>how<\/em>&nbsp;the algorithm does these things, such measures can raise concerns about active censorship. \u201cDisclosing what they\u2019re doing, how they\u2019re doing it, and what content might potentially get blocked in the process is the only way the platforms can counter suspicions that such measures are designed to manipulate politics or otherwise exert illegitimate influence.\u201d (Barrett et al., 2021).&nbsp;<\/p>\n\n\n\n<p>In conclusion, I feel weirdly positive about the current state of YouTube\u2019s algorithm. It fails less egregiously than most comparable platforms, puts active effort into countering misinformation via the additional resources on vaccines\/climate change\/the Holocaust\/etc. underneath videos, and discourages right-wing radicalization by slowing the progression to the extremes while filling the recommendations with more centrist news. It\u2019s still not great, and there is always the possibility that changes to the algorithm will reverse these efforts, but compared to platforms like Facebook or Twitter, whose CEOs actively use them to spread misinformation and radicalize their user bases, YouTube has a pretty good stance, all things considered.<\/p>\n\n\n\n<h1 class=\"wp-block-heading\">Works Cited<\/h1>\n\n\n\n<p>Ribeiro, M. H., Ottoni, R., West, R., Almeida, V. A. F., &amp; Meira, W. (2020). Auditing Radicalization Pathways on YouTube.&nbsp;<em>arXiv<\/em>.<\/p>\n\n\n\n<p>Haroon, M., Wojcieszak, M., Chhabra, A., Liu, X., Mohapatra, P., &amp; Shafiq, Z. (2023). Auditing YouTube\u2019s recommendation system for ideologically congenial, extreme, and problematic recommendations.&nbsp;<em>PNAS<\/em>, 120(50).<\/p>\n\n\n\n<p>Bryant, L. (2020). 
The YouTube Algorithm and the Alt-Right Filter Bubble.&nbsp;<em>Open Information Science<\/em>, 4(1), 85-90.&nbsp;<a href=\"https:\/\/doi.org\/10.1515\/opis-2020-0007\">https:\/\/doi.org\/10.1515\/opis-2020-0007<\/a><\/p>\n\n\n\n<p>Ibrahim, H., Al Dahoul, N., Lee, S., Rahwan, T., &amp; Zaki, Y. (2023). YouTube&#8217;s recommendation algorithm is left-leaning in the United States.&nbsp;<em>PNAS Nexus<\/em>,&nbsp;2(8), pgad264.&nbsp;<a href=\"https:\/\/doi.org\/10.1093\/pnasnexus\/pgad264\">https:\/\/doi.org\/10.1093\/pnasnexus\/pgad264<\/a><\/p>\n\n\n\n<p>Abul-Fottouh, D., Song, M. Y., &amp; Gruzd, A. (2020). Examining algorithmic biases in YouTube\u2019s recommendations of vaccine videos.&nbsp;<em>International Journal of Medical Informatics<\/em>, 140, 104175.<\/p>\n\n\n\n<p>Barrett, P. M., Hendrix, J., &amp; Sims, J. G. (2021). Fueling the Fire: How Social Media Intensifies Political Polarization.&nbsp;<em>NYU Stern Center for Business and Human Rights<\/em>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>YouTube is one of the largest information-sharing platforms in the world, if not the largest. While not commonly considered a \u201csocial media\u201d platform, it enables people from all around the world to share ideas and experiences and connect with one another, albeit through uploaded video content rather than messages or images. 
Over 30,000 hours of content [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":9,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-33","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=\/wp\/v2\/pages\/33","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=33"}],"version-history":[{"count":1,"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=\/wp\/v2\/pages\/33\/revisions"}],"predecessor-version":[{"id":34,"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=\/wp\/v2\/pages\/33\/revisions\/34"}],"up":[{"embeddable":true,"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=\/wp\/v2\/pages\/9"}],"wp:attachment":[{"href":"https:\/\/responsiblepraxis.ai\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=33"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}