YouTube's recommendation algorithm is left-leaning in the United States

Hazem Ibrahim, Nouar AlDahoul, Sangjin Lee, Talal Rahwan, Yasir Zaki

Research output: Contribution to journal › Article › peer-review


With over two billion monthly active users, YouTube currently shapes the landscape of online political video consumption, with 25% of adults in the United States regularly consuming political content via the platform. Considering that nearly three-quarters of the videos watched on YouTube are delivered via its recommendation algorithm, the propensity of this algorithm to create echo chambers and deliver extremist content has been an active area of research. However, it is unclear whether the algorithm exhibits political leanings toward either the Left or the Right. To fill this gap, we constructed archetypal users across six personas in the US political context, ranging from Far Left to Far Right. Using these personas, we performed a controlled experiment in which they consumed over eight months' worth of videos and were recommended over 120,000 unique videos. We find that while the algorithm pulls users away from political extremes, this pull is asymmetric: users are pulled away from Far Right content more strongly than from Far Left content. Furthermore, we show that the recommendations made by the algorithm skew left even when the user has no watch history. Our results raise questions about whether the recommendation algorithms of social media platforms in general, and YouTube in particular, should exhibit political biases, and about the wide-reaching societal and political implications that such biases could entail.

Original language: English (US)
Article number: pgad264
Journal: PNAS Nexus
Issue number: 8
State: Published - Aug 1 2023


Keywords

  • algorithmic bias
  • political radicalization
  • recommendation systems

ASJC Scopus subject areas

  • General


