Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: Sep 4, 2022
Date Accepted: Jan 30, 2023
Date Submitted to PubMed: Feb 3, 2023
One year of COVID-19 vaccine misinformation on Twitter
ABSTRACT
Background:
Vaccinations play a critical role in mitigating the impact of COVID-19 and other diseases. Past research links misinformation, including that which spreads on social media, to increased vaccine hesitancy and lower vaccination rates. Gaps remain in our knowledge of the main drivers of vaccine misinformation on social media and of effective ways to intervene.
Objective:
This study explores COVID-19 vaccine misinformation circulating on Twitter during 2021, when vaccines were being released to the public in an effort to mitigate the global pandemic. We measure the prevalence of content originating from low-credibility news websites and YouTube videos, and identify the main spreaders of vaccine misinformation.
Methods:
We collected almost 300 million English-language tweets related to COVID-19 vaccines using a list of over 80 relevant keywords over a period of 12 months. We then extracted and labeled news articles at the source level, based on third-party lists of low-credibility and mainstream news sources, and measured the prevalence of different kinds of information. We also considered suspicious YouTube videos shared on Twitter. To identify spreaders of vaccine misinformation, we focused on verified Twitter accounts and employed a bot detection algorithm to identify accounts that are likely automated.
Results:
Our findings show a low prevalence of low-credibility information compared to mainstream news. However, the most popular low-credibility sources had reshare volumes comparable to those of many mainstream sources, and larger volumes than authoritative sources such as the U.S. Centers for Disease Control and Prevention and the World Health Organization. Throughout the year, we observed an increasing trend in the prevalence of low-credibility news relative to mainstream news about vaccines. We also observed a considerable number of suspicious YouTube videos shared on Twitter. Tweets by a small group of about 800 Twitter-verified "superspreaders" accounted for approximately 35% of all reshares of misinformation on an average day, with the top superspreader (@RobertKennedyJr) responsible for over 13% of retweets. We also found that low-credibility news and suspicious YouTube videos were more likely to be shared by automated accounts.
Conclusions:
The broad spread of rumors and conspiracy theories around COVID-19 vaccines on Twitter during 2021 shows that there was an audience for this type of content, possibly fueled by distrust of science and governments. Our findings are also consistent with the hypothesis that superspreaders are driven by financial incentives that allow them to profit from health misinformation. Despite high-profile cases of deplatformed misinformation superspreaders, our results show that in 2021 a few individuals played an outsize role in the spread of low-credibility vaccine content. Social media policies should therefore consider revoking the verified status of repeat spreaders of harmful content, especially during public health crises.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft other than for review purposes.