YouTube blocks all anti-vaccine content, bans Robert F. Kennedy Jr. and Joseph Mercola
YouTube is removing and blocking anti-vaccine content.
YouTube is taking steps to ban and remove all content that spreads misinformation about immunizations for COVID-19 and other illnesses, including measles, hepatitis, and chicken pox.
The Google-owned online video giant announced Wednesday that it will remove any content that "falsely asserts that licensed vaccines are hazardous or cause chronic health consequences."
"This includes content that falsely asserts that licensed vaccines cause autism, cancer, or infertility, or that vaccines contain ingredients that may track those who get them."

Google reports that it has removed 130,000 videos since 2020 for violating the company's COVID-19 vaccine misinformation policies, and that it is stepping up its efforts.
"We're expanding our YouTube policy against medical misinformation with additional recommendations on presently delivered vaccines that have been approved and confirmed safe and effective by local health authorities and the World Health Organization," the company said.
The company will remove individual videos from some users and, as first reported by the Washington Post, will completely deactivate the accounts of serial misinformation spreaders, including Joseph Mercola, an American physician with over half a million subscribers, and Robert F. Kennedy Jr., the son of the late senator and presidential candidate Robert F. Kennedy and a vocal critic of vaccines.
Claims about vaccines that are still undergoing trials will remain permissible. Personal accounts of reactions to a vaccine will also be allowed, as long as they do not come from an account known for spreading vaccine misinformation.

Timothy Caulfield, the Canada Research Chair in Health Law and Policy at the University of Alberta, told CBC News Wednesday that the move is long overdue.
Asked why it took so long, he said, "the cynical reason is that these videos are incredibly popular and do drive traffic." Social media algorithms reward whatever garners attention, he noted, and "there is a wealth of studies indicating that misinformation gains significant momentum on these platforms; in fact, it travels further and faster than the truth."
The decision comes amid criticism that YouTube and other digital giants such as Facebook and Twitter are not doing enough to prevent the spread of false health information on their platforms. Twitter has recently increased the amount of content it flags as misleading or "manipulated media," as well as the number of accounts it temporarily or permanently bans.
Caulfield warns there is a risk that YouTube's action may simply add fuel to conspiracy theories promoted by some online purveyors of misinformation, who allege that big internet companies are trying to stifle the truth. However, he believes the move will do far more good than harm.
"It's unlikely to have as much impact on those adamant naysayers, but we know how tough it is to change their ideas in the first place," he said. "The moveable middle – those who are complacent or only slightly hesitant — should always be your objective."