YouTube said Wednesday it would remove videos that falsely claim approved vaccines are dangerous, as social networks seek to crack down on health misinformation around COVID-19 and other diseases.
The video-sharing giant had already banned posts that spread myths about coronavirus treatments, including inaccurate claims about COVID-19 vaccines shown to be safe.
But the Google-owned site said its concerns about the spread of medical conspiracy theories went beyond the pandemic.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general,” the company said in a statement.
“We’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines.”
The expanded policy will apply to “currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the World Health Organization.”