
Explained: Why YouTube has blocked anti-vaccine content

So far, YouTube has already removed over 130,000 videos for violating Covid-19 vaccine policies.

By: Explained Desk | New Delhi |
October 2, 2021 2:35:50 pm
Silhouettes of laptop and mobile device users are seen next to a screen projection of the YouTube logo in this picture illustration taken March 28, 2018. (Reuters Illustration: Dado Ruvic)

On Wednesday, YouTube announced it would be expanding its medical misinformation policies with new guidelines on vaccines, covering not only vaccines that work against Covid-19 but also general statements about other vaccines.

The move comes amid criticism that social media platforms are not doing enough to tackle misinformation related to Covid-19.

What kind of misinformation is YouTube targeting?

According to the new policy that came into effect on Wednesday (September 29), any kind of content that says approved Covid-19 vaccines cause autism, cancer or infertility, or claims substances in vaccines can track those who receive them, will be removed.

Further, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation about the substances in the vaccines will also be removed.

In a blog post, YouTube said its Community Guidelines already prohibit certain types of medical misinformation, including content that promotes harmful remedies, like information that drinking turpentine can cure diseases.

Since the pandemic began, the platform has been targeting Covid-19 and medical misinformation. So far, it has removed over 130,000 videos for violating its Covid-19 vaccine policies.

What kind of content is treated as misinformation by YouTube?

When it comes to Covid-19 related content, YouTube treats the following as misinformation:

  • Content that encourages the use of home remedies, prayer or rituals in place of medical treatment such as consulting a doctor or going to a hospital
  • Content that claims that there’s a guaranteed cure for Covid-19
  • Content that recommends use of Ivermectin or Hydroxychloroquine for the treatment of Covid-19
  • Claims that Hydroxychloroquine is an effective treatment for Covid-19
  • Categorical claims that Ivermectin is an effective treatment for Covid-19
  • Claims that Ivermectin and Hydroxychloroquine are safe to use in the treatment of Covid-19
  • Other content that discourages people from consulting a medical professional or seeking medical advice

One likely reason for YouTube’s decision to expand its misinformation policies is vaccine hesitancy, especially in the United States.

According to a survey conducted by the Pew Research Center, Democrats in the US are far more likely than Republicans to have received at least one dose of a Covid-19 vaccine. The survey also found that a person’s vaccination status is strongly linked to confidence in the vaccine research and development process.

About 81 per cent of the survey’s respondents said they did not know whether Covid-19 vaccines posed serious health risks, and 80 per cent said public health officials were not telling them everything they know about these vaccines.
