
YouTube Is Banning All Content That Spreads Vaccine Misinformation






YouTube has announced immediate bans on false claims that vaccines are dangerous and cause health issues like autism, cancer or infertility.








Danny Moloshok/AP




"We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines," the company said.


YouTube says it has already taken pages down


YouTube says it now bans videos that claim that vaccines aren't safe or effective, or that they cause other health issues like cancer and infertility. In its announcement, the company pointed specifically to videos that inaccurately describe what ingredients are used in vaccines, as well as allegations that vaccines contain properties that can be used to "track those who receive them."

There are some exceptions: Users are still allowed to share content related to their personal experiences with the vaccine, but only if those videos adhere to the site's community guidelines and the channel in question doesn't routinely encourage "vaccine hesitancy."

The new mandate goes into effect immediately, and YouTube has already removed pages known for sharing anti-vaccination sentiments, like those belonging to prominent vaccine opponents Joseph Mercola, Erin Elizabeth, Sherri Tenpenny and Robert F. Kennedy Jr.'s Children's Health Defense organization, CNBC reports.


The company says widespread enforcement will take time





But the company, which is owned by Google, warned that more widespread removal of videos may take some time as it works to enforce the policy.

As big tech companies like YouTube and Facebook have tightened their restrictions regarding vaccine misinformation over the last year, many conspiracy theorists began migrating to other, less-regulated platforms. Rumble, another video-sharing site, has become a popular choice for far-right groups and others who are vaccine resistant, Slate reported in March.

But many conservative pages that spread vaccine misinformation are still active on YouTube and their videos continue to attract millions of views.

Editor’s note: Google is among NPR’s financial supporters.


