YouTube is blocking anti-vaccine content to fight medical misinformation

Sep 29 2021, 9:34 pm

In case you didn’t know where YouTube stood on anti-vaccine content, their updated guidelines to fight medical misinformation will make it clear.

On Wednesday, September 29, the YouTube team announced how they will continue to manage "harmful" anti-vaccine content on their platform.

YouTube said that the vaccines that are being administered have been “approved and confirmed to be safe and effective by local health authorities and the WHO.”

Since the start of the pandemic, YouTube's community guidelines have prohibited content that "promotes harmful remedies," like drinking turpentine to cure diseases.

The platform has removed over 130,000 videos in the last year for violating its existing COVID-19 vaccine policies.

Now, YouTube will take down content that makes anti-vaccine claims. Firstly, content that claims vaccines are unsafe is not allowed, including videos that attribute chronic side effects to vaccines beyond the rare side effects acknowledged by health authorities.

Secondly, any content that claims that vaccines don’t work, either by reducing transmission or preventing infection, is also banned.

Finally, content that misrepresents the substances and ingredients of vaccines is not allowed.

These vaccine guidelines don’t just extend to COVID-19; they cover all vaccine misinformation.

YouTube channels and accounts may be terminated if they repeatedly violate these guidelines, or even after a single serious abuse.


YouTube is the second-largest search engine on the internet, after Google. The platform has more than two billion monthly logged-in users, who consume more than a billion hours of content each day.

Sarah Anderson
