YouTube continues to crack down on health misinformation, rolling out new guidelines this week around vaccines.
The company is now prohibiting YouTube content that includes misinformation about approved and administered vaccines. Specifically, the Google subsidiary said it will ban content that claims approved vaccines cause side effects beyond those recognized by health authorities, denies the efficacy of vaccines, or misrepresents vaccine ingredients.
First-time offenders will have the offending content removed and receive an email explaining the violation; typically there is no further penalty for a first offense. However, a channel that accrues three violations within 90 days will be terminated.
YouTube specified it may allow some content that violates this policy if it’s presented with additional context.
“Additional context may include countervailing views from local health authorities or medical experts,” YouTube wrote on the updated policy page. “We may also make exceptions if the purpose of the content is to condemn, dispute, or satirize misinformation that violates our policies.
“We may also make exceptions for content showing an open public forum, like a protest or public hearing, provided the content does not aim to promote misinformation that violates our policies.”
WHY IT MATTERS
Public health officials are urging eligible people to get the COVID-19 vaccine. More than 43 million COVID-19 cases and nearly 700,000 COVID-related deaths have been reported in the U.S. since the start of the pandemic, according to the CDC. Today, approximately 75.4% of eligible residents have received at least one dose of a COVID-19 vaccine, and 65% are fully vaccinated.
However, since the start of the pandemic, there has been a rise in misinformation related to COVID-19 and the corresponding vaccines. Numerous studies have linked social media to COVID-19 misinformation.
THE LARGER TREND
YouTube has been working on the medical misinformation problem for some time. In early 2021, the company hired former U.S. Deputy Assistant Secretary of Health Dr. Garth Graham in an effort to promote high-quality health content on its platform.
The video company has also teamed up with celebrities to talk about COVID-19 vaccines as part of a public service announcement collaboration with the Vaccine Confidence Project and the London School of Hygiene & Tropical Medicine.
In an August blog post, YouTube Chief Product Officer Neal Mohan said that removing videos that spread health misinformation isn't enough to contain the problem. He said that YouTube has already removed more than a million videos that spread COVID-19 misinformation.
In the blog post, he proposed boosting the amount of factual content on the platform in addition to removing harmful videos. However, Mohan also questioned what aggressive removal of such content could mean for freedom of speech.
Source: Mobihealthnews