YouTube said on Wednesday that it was banning the accounts of several prominent anti-vaccine activists from its platform, including those of Joseph Mercola and Robert F. Kennedy Jr., as part of an effort to remove all content that falsely claims that approved vaccines are dangerous.
In a blog post, YouTube said it would remove videos claiming that vaccines do not reduce rates of transmission or contraction of disease, and content that includes misinformation on the makeup of the vaccines. Claims that approved vaccines cause autism, cancer or infertility, or that the vaccines contain trackers will also be removed.
The platform, which is owned by Google, has had a similar ban on misinformation about the Covid-19 vaccines. But the new policy expands the rules to misleading claims about long-approved vaccines, such as those against measles and hepatitis B, as well as to falsehoods about vaccines in general, YouTube said. Personal testimonies relating to vaccines, content about vaccine policies and new vaccine trials, and historical videos about vaccine successes or failures will be allowed to remain on the site.
“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board” in policies that bring its users high-quality information, the company said in its announcement.
In addition to banning Dr. Mercola and Mr. Kennedy, YouTube removed the accounts of other prominent anti-vaccination activists such as Erin Elizabeth and Sherri Tenpenny, a company spokeswoman said.
The new policy puts YouTube more in line with Facebook and Twitter. In February, Facebook said that it would remove posts with erroneous claims about vaccines, including taking down assertions that vaccines cause autism or that it is safer for people to contract the coronavirus than to receive vaccinations against it. But the platform remains a popular destination for people discussing misinformation, such as the unfounded claim that the pharmaceutical drug ivermectin is an effective treatment for Covid-19.
In March, Twitter introduced its own policy that explained the penalties for sharing lies about the virus and vaccines. But the company gives users five "strikes" before permanently barring them for violating its coronavirus misinformation policy.
The accounts of high-profile anti-vaccination activists like Dr. Mercola and Mr. Kennedy remain active on Facebook and Twitter — although Instagram, which is owned by Facebook, has suspended Mr. Kennedy's account.
Misinformation researchers have for years pointed to the proliferation of anti-vaccine content on social networks as a factor in vaccine hesitancy — including slowing rates of Covid-19 vaccine adoption in more conservative states. Reporting has shown that YouTube videos often act as the source of content that subsequently goes viral on platforms like Facebook and Twitter, sometimes racking up tens of millions of views.
“One platform’s policies affect enforcement across all the others because of the way networks work across services,” said Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech and misinformation. “YouTube is one of the most highly linked domains on Facebook, for example.”
She added: “It’s not possible to think of these issues platform by platform. That’s not how anti-vaccination groups think of them. We have to think of the internet ecosystem as a whole.”
Prominent anti-vaccine activists have long been able to build huge audiences online, helped along by the algorithmic powers of social networks that prioritize videos and posts that are particularly successful at capturing people's attention. A nonprofit group, the Center for Countering Digital Hate, published research this year showing that a group of 12 people were responsible for sharing 65 percent of all anti-vaccine messaging on social media, dubbing the group the "Disinformation Dozen." In July, the White House cited the research as it criticized tech companies for allowing misinformation about the coronavirus and vaccines to spread widely, sparking a tense back-and-forth between the administration and Facebook.
Dr. Mercola, an osteopathic physician, took the top spot in the Disinformation Dozen. His following on Facebook and Instagram totals more than three million, while his YouTube account, before it was taken down, had nearly half a million followers. Dr. Mercola’s Twitter account, which is still live, has over 320,000 followers.
YouTube said that in the past year it had removed over 130,000 videos for violating its Covid-19 vaccine policies. But this did not include what the video platform called “borderline videos” that discussed vaccine skepticism on the site. In the past, the company simply removed such videos from search results and recommendations, while promoting videos from experts and public health institutions.
Ben Decker contributed research.