YouTube Takes a Stand Against Misinformation

In an attempt to limit the spread of misinformation, YouTube has removed more than 130,000 videos with false information about the COVID-19, flu, HPV, and MMR vaccines.

In a blog post dated September 29, 2021, YouTube announced that it had expanded its medical misinformation policy as part of its ongoing work to “bring high-quality information” to viewers. The new guidelines prohibit users from posting content that “includes harmful misinformation about currently approved and administered vaccines,” covering their safety, efficacy, and ingredients.

Entire accounts have been taken down, including the channels of Joseph Mercola and Robert F. Kennedy Jr., two prominent anti-vaccination activists among the “Disinformation Dozen,” the group responsible for the bulk of vaccine misinformation shared on social media platforms worldwide.

The new guidelines will continue to allow content related to public discussion and debate of the scientific process, vaccine policies, new vaccine trials, historical vaccine successes and failures, and personal testimonials, “so long as the video doesn’t violate other Community Guidelines, or the channel doesn’t show a pattern of promoting vaccine hesitancy.”

Unlike Facebook, which banned vaccine misinformation in February, and Twitter, which adopted a similar policy in March, YouTube had largely escaped scrutiny in the public discourse during the COVID-19 pandemic. As the second largest search engine after Google, YouTube is an essential instrument for widely distributing public health news, and it lets users discuss content in the comments section.

The spread of misinformation can have detrimental effects on public health policies, potentially influencing people to make behavioral decisions that put collective health at risk: ignoring recommendations from public health authorities, reducing their intent to get vaccinated, or deepening distrust in science.

The misinformation crisis is amplified further when false claims spread in homogeneous, self-contained bubbles, such as YouTube channels that promote anti-vaccination claims in the absence of accurate information. Users may develop a false perception that the misinformation shared in a video is accurate, a perception further reinforced and validated by affirming voices in the comments section.

In a 2021 study, researchers in Germany and the UK refer to this pattern as informational homogeneity, “the extent to which uniform types of information are connected to each other.” Users in these cocoons of misleading content interact with like-minded viewers and have limited access to contradictory information. The study highlights how the communication landscape fragments into subgroups that are homogeneous in the content they share and discuss. Fragmentation can aid the dissemination of relevant information, but it carries the risk of polarization in groups that spread misinformation, conspiracy theories, or extreme ideologies. Informational homogeneity in fragmented subgroups sharing misinformation disconnects members from sources of accurate information.
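To make the concept concrete, one simple way to operationalize informational homogeneity (this is an illustration, not the measure used in the study, and the channel names and labels are hypothetical) is the share of connections that link sources sharing the same type of content:

```python
# Hypothetical illustration of "informational homogeneity": the share of
# connections that link sources sharing the same type of content.
# (Not the measure used in the 2021 study; names and labels are invented.)

# Each channel is labeled by the type of content it shares.
content_type = {
    "channel_a": "anti-vax",
    "channel_b": "anti-vax",
    "channel_c": "anti-vax",
    "channel_d": "public-health",
}

# Each edge represents a connection (e.g., shared audience or cross-links).
edges = [
    ("channel_a", "channel_b"),
    ("channel_b", "channel_c"),
    ("channel_a", "channel_c"),
    ("channel_c", "channel_d"),
]

# Fraction of connections that join same-type channels.
same_type = sum(content_type[u] == content_type[v] for u, v in edges)
homogeneity = same_type / len(edges)
print(f"informational homogeneity: {homogeneity:.2f}")  # 0.75 -> mostly uniform ties
```

In this toy network, three of the four ties connect anti-vaccination channels to each other, so a viewer inside that subgroup rarely encounters the lone public health source.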

YouTube’s recommendation algorithm leads users who initially watched a video containing false information to more videos with similar content, confining them to a rabbit hole of predominantly inaccurate material. The algorithm ‘learns’ from videos watched, click-through rates, average watch time, and engagement (liking, disliking, commenting) to tailor recommendations for each user, maximizing retention and generating advertising revenue for YouTube. The algorithm does not weigh content accuracy: as long as a video generates engagement, it keeps being recommended. This algorithmic bias propagates misinformation because suggestions are personalized to a user’s activity rather than to the reliability of the content.
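The sketch below makes this dynamic concrete by ranking a toy set of videos purely on engagement signals, with accuracy deliberately absent from the score. It is a minimal illustration, not YouTube’s actual system; the field names and weights are hypothetical.

```python
# Toy engagement-only ranker (not YouTube's actual system).
# All field names and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    click_through_rate: float   # fraction of impressions clicked
    avg_watch_fraction: float   # average share of the video watched
    engagement_rate: float      # likes/dislikes/comments per view
    is_accurate: bool           # never consulted by the ranker below

def engagement_score(v: Video) -> float:
    """Rank purely on retention and engagement; accuracy never enters."""
    return (0.5 * v.avg_watch_fraction
            + 0.3 * v.click_through_rate
            + 0.2 * v.engagement_rate)

candidates = [
    Video("Debunked vaccine claim", 0.12, 0.85, 0.09, is_accurate=False),
    Video("Public health briefing",  0.04, 0.40, 0.02, is_accurate=True),
]

# The misleading but highly engaging video is recommended first.
for v in sorted(candidates, key=engagement_score, reverse=True):
    print(f"{engagement_score(v):.3f}  {v.title}")
```

Because the score rewards only watch time, clicks, and reactions, the misleading but highly engaging video surfaces first, mirroring the bias described above.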

With more than 2 billion users every month watching over 1 billion hours of video every day, YouTube plays an important role in distributing news during crises, acting as a channel through which health authorities disseminate scientific messages to the public about the state of the pandemic and essential preventive measures. During the COVID-19 pandemic, the world has also experienced what the World Health Organization termed an infodemic: an overabundance of misinformation, disinformation, conspiracy theories, and propaganda on social media platforms that has derailed public health professionals’ efforts to control the pandemic. YouTube’s ban on anti-vaccination content is a late but necessary step in fighting this wave of misinformation.

Elissar Gerges
Contributor at The Commoner

Elissar Gerges has more than 10 years of experience as an AP and IBDP Biology teacher and head of the Biology department. She holds a Master of Science in Education from Walden University, a Master of Education in Curriculum Studies and Teacher Development from the University of Toronto, and a Doctor of Education (EdD) in Educational Leadership from Western University, Canada. Elissar’s research focus is on learning communities, team leadership, instructional leadership, and integrating citizenship in science education. She is a strong advocate of science media literacy to enable all students, as active citizens, to critically evaluate science in the media and make informed decisions.
