YouTube Cracks Down on QAnon Conspiracy Theories, Citing Offline Abuse
YouTube played a larger role than other platforms in bringing QAnon from the fringes to the mainstream.
On Thursday, YouTube became the latest social networking giant to take action against QAnon, the vast pro-Trump conspiracy theorist collective whose online delusions about a world-running cabal of satanic pedophiles have spilled over into offline abuse.
In a blog post, the company announced that it was expanding its hate speech and harassment policies to ban "material targeting a person or group with conspiracy theories used to justify real-world violence." The new policy would prohibit material endorsing QAnon as well as similar conspiracy theories such as Pizzagate, which falsely claims that top Democrats and Hollywood elites are running a child-abuse ring.
Other social networks have also taken measures to curtail QAnon's expansion, following incidents of violence and vandalism linked to the movement. Last week, Facebook hardened its QAnon-related guidelines, likening the movement to a militarized social movement. Several smaller sites, including Pinterest, Etsy, and Triller, announced new controls on QAnon content this week.
Under the latest guidelines introduced by YouTube on Thursday, "material that attacks or harasses others by implying that they are involved" in a harmful theory like QAnon or Pizzagate is prohibited. News coverage and analysis of these ideas will still be permitted, as will videos that discuss the theories without targeting individuals or groups.
The QAnon movement began in 2017, when an anonymous user posting under the name "Q Clearance Patriot," or "Q," started leaving coded messages on 4chan, the famously toxic message board, claiming to have sensitive information about a covert war between President Trump and a global pedophile cabal. QAnon adherents — known as "bakers" — began debating and decoding the messages in real time on sites like Reddit and Twitter, connecting the dots on what amounted to a rebranding of centuries-old anti-Semitic tropes that falsely accused influential Democrats, like Hillary Clinton and the liberal financier George Soros, of pulling the strings on a worldwide human-trafficking plot.
Few platforms played a bigger role than YouTube in bringing QAnon from the fringes to the mainstream. In the early days of the movement, QAnon followers produced YouTube documentaries offering an introductory crash course in its core beliefs. The videos were shared on Facebook and other sites, often as a way to recruit new believers. Some were viewed millions of times.
QAnon supporters also started YouTube talk shows to discuss new developments in the theory. Many of these channels amassed large audiences and made their owners prominent voices within the movement.
"YouTube plays a huge role in Q mythology," said Mike Rothschild, a conspiracy theory debunker who is writing a book about QAnon. "There are major players in the Q community who make videos on a regular basis, draw hundreds of thousands of viewers, and package their ideas in sleek productions that are a world away from the straight-to-camera rambles so common in conspiracy theory videos."
For years, YouTube has sought to curtail the spread of misinformation and conspiracy theories on its platform, and to tweak the recommendation algorithm that sent millions of viewers to what it called low-quality material. In 2019, the company began demoting what it called "borderline content" — videos that brushed up against its rules without technically violating them — and limiting the visibility of such videos in search results and recommendations.
The company claims these changes have reduced by more than 70 percent the number of views that borderline material receives from recommendations, though this figure cannot be independently verified. YouTube also says that after the 2019 policy change, the number of views that a set of prominent pro-QAnon channels received from recommendations fell by more than 80 percent.
Social media sites have come under criticism in recent weeks, with Democrats accusing them of doing too little to combat the spread of right-wing misinformation, and Republicans, including President Trump, painting them as censorious threats to free speech.
YouTube, which is owned by Google, has remained largely out of the political fray, despite the platform's massive success — users watch over a billion hours of YouTube content every day — and the abundance of misinformation and conspiracy theories on the service. Its chief executive, Susan Wojcicki, has largely avoided the public attacks and congressional summonses faced by Twitter's Jack Dorsey and Facebook's Mark Zuckerberg.
Vanita Gupta, chief executive of the Leadership Conference on Civil and Human Rights, a coalition of civil rights groups, praised YouTube's ban on QAnon videos.
"We applaud YouTube for banning this malicious and hateful material that targets people with conspiracy theories used to justify offline abuse, particularly through movements like QAnon," Ms. Gupta said. "This online content contributes to real-world abuse, promoting hatred that harms whole communities."
Mr. Rothschild, the QAnon researcher, predicted that QAnon believers kicked off YouTube would find ways to distribute their videos on smaller platforms. He also warned that followers of the movement were notorious for attempting to evade platform bans, and that YouTube would have to stay vigilant to prevent them from signing up again and trying anew.
"YouTube banning Q videos and suspending Q promoters is a positive step," he said, "but it won't be the end of Q."