
YouTube is purging all anti-vaccine misinformation. What took so long?

For years, YouTube executives have faced criticism for allowing conspiracy theorists and radical hatemongers to thrive on the platform.

YouTube said Wednesday that it’s purging videos and accounts that spread misinformation about Covid-19 vaccines — and all other vaccines for that matter.

In a blog post announcing the policy, YouTube said it will remove content that “falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines.” 

It's a welcome but belated announcement from a company known for doing “too little, too late” to curb misinformation.

A Bloomberg report from 2019 found executives knew YouTube was becoming a haven for incendiary conservative web hosts who shared conspiracy theories about everything from the 2018 Parkland school shooting to Hillary Clinton, but they were slow to act because the videos brought traffic to the site.

YouTube’s decision to remove harmful anti-vaccine content is its strictest policy yet toward misleading, conspiratorial content that has the power to kill and delude.

Already, accounts posting content from vaccine conspiracy theorists, including the prominent anti-vaxxer Robert F. Kennedy Jr., have been removed from YouTube in accordance with the policy. 

The company said Wednesday that it has removed more than 130,000 videos since last year for violating its existing Covid-19 misinformation policies.

Last month, YouTube said it had removed more than 1 million videos related to dangerous coronavirus misinformation since February 2020. Those removals have included videos from prominent political figures, among them Florida Gov. Ron DeSantis, a Republican. In April, YouTube removed a video from his channel for spreading misinformation about the efficacy of wearing masks to curb the spread of the coronavirus. And Sen. Rand Paul, R-Ky., faced a one-week suspension last month for floating similar theories about mask use on his channel.

Head over to The ReidOut Blog for more.
