Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: May 16, 2023
Date Accepted: Jul 20, 2023
Exploring YouTube’s recommendation system in the context of COVID-19 vaccines: A comparative analysis of video trajectories
ABSTRACT
Background:
Throughout the COVID-19 pandemic, there has been concern that social media may contribute to vaccine hesitancy because anti-vaccine content is widely available on these platforms. YouTube has stated its commitment to removing content that contains vaccination misinformation. Nevertheless, such claims are difficult to audit, and more empirical research is needed to evaluate the actual prevalence of anti-vaccine sentiment online.
Objective:
This study examines recommendations made by YouTube’s algorithms to investigate whether the platform may facilitate the spread of anti-vaccine sentiment online. We assess the prevalence of anti-vaccine sentiment in recommended videos and evaluate how the recommendations seen by real-world users differ from the personalized recommendations obtained through synthetic data collection methods, which are often used to study YouTube’s recommendation system.
Methods:
We trace trajectories from a credible seed video posted by the World Health Organization (WHO) to anti-vaccine videos, following only video links suggested by YouTube’s recommendation system. First, we gamify the process by asking real-world participants to intentionally find an anti-vaccine video with as few clicks as possible. Having collected crowdsourced trajectory data from respondents recruited through (1) the WHO/United Nations (UN) system (N = 33) and (2) Amazon Mechanical Turk (N = 80), we next compare the recommendations seen by these users to recommended videos obtained from (3) the YouTube API’s RelatedToVideoID parameter (N = 40) and (4) from clean browsers without any identifying cookies (N = 40), which serve as reference points. We develop machine learning methods to classify anti-vaccine content at scale, enabling us to automatically evaluate 27,074 video recommendations made by YouTube.
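To make the API-based reference condition concrete, the sketch below shows one way recommendations related to a seed video could be retrieved from the YouTube Data API v3 via its relatedToVideoId search parameter (written RelatedToVideoID in this abstract). This is an illustrative reconstruction, not the authors’ data-collection code: the seed video ID and the YT_API_KEY environment variable are hypothetical placeholders, and Google has since deprecated this parameter, so the call reflects the study-era API rather than the current one.

```python
# Illustrative sketch only (not the authors' pipeline): query the YouTube Data
# API v3 search endpoint for videos reported as related to a seed video.
# Assumes an API key is available in the YT_API_KEY environment variable;
# SEED_VIDEO_ID is a hypothetical placeholder (e.g., the WHO seed video).
import os
import requests

API_URL = "https://www.googleapis.com/youtube/v3/search"
SEED_VIDEO_ID = "EXAMPLE_VIDEO_ID"  # hypothetical seed video ID


def related_videos(video_id: str, max_results: int = 40) -> list[dict]:
    """Return basic metadata for videos the API lists as related to video_id."""
    params = {
        "part": "snippet",
        "relatedToVideoId": video_id,  # parameter since deprecated by Google
        "type": "video",
        "maxResults": max_results,
        "key": os.environ["YT_API_KEY"],
    }
    response = requests.get(API_URL, params=params, timeout=30)
    response.raise_for_status()
    return [
        {
            "videoId": item["id"]["videoId"],
            "title": item["snippet"]["title"],
            "channel": item["snippet"]["channelTitle"],
        }
        for item in response.json().get("items", [])
    ]


if __name__ == "__main__":
    for video in related_videos(SEED_VIDEO_ID):
        print(video["videoId"], video["title"])
```

Recommendations gathered this way (or from a cookie-free browser) reflect no watch history, which is precisely why the study treats them as reference points rather than as proxies for what real users see.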
Results:
We found no evidence that YouTube promotes anti-vaccine content: the average share of anti-vaccine videos remained well below 6% at all steps in users’ recommendation trajectories. However, users’ watch histories significantly affect video recommendations, suggesting that data from the API or from a clean browser does not offer an accurate picture of the recommendations that real users see. Real users saw slightly more pro-vaccine content as they advanced through their recommendation trajectories, whereas synthetic users were drawn toward irrelevant recommendations as they advanced. Rather than anti-vaccine content, videos recommended by YouTube are more likely to contain health-related content that is not specifically related to vaccination; these videos tend to be longer and to feature more popular content.
Conclusions:
Our findings suggest that the common perception of YouTube’s recommendation system as a “rabbit hole” may be inaccurate, and that YouTube may instead follow a “blockbuster” strategy that attempts to engage users by promoting content that has been reliably successful across the platform.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.