YouTube says it will ban supremacist content and remove videos that deny well-documented atrocities, like the Holocaust and the massacre at Sandy Hook Elementary School. The company says it will remove hundreds of thousands of videos that had not previously been considered in violation of its rules.
The move comes as the video service, owned by Google, faces increasing scrutiny for hosting extreme and divisive content.
In a blog post Wednesday, YouTube said it was prohibiting “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.”
On Tuesday, however, YouTube said videos mocking Carlos Maza, a video producer for the website Vox, for his sexual orientation did not violate its policies. A series of videos posted by a right-wing commentator included calling Maza “an angry little queer.”
In a tweet to Maza, YouTube said, “while we found language that was clearly hurtful, the videos as posted don’t violate our policies.”
YouTube’s ban on supremacist content comes a few months after Facebook said it was banning white nationalist content from its platform. That ban came two weeks after the suspect in the terror attack at two New Zealand mosques streamed part of the massacre live on the platform. A manifesto allegedly written by the suspect revealed white nationalist views.
Last August, YouTube and other social media platforms banned the conspiracy theorist Alex Jones. Jones had previously pushed false conspiracy theories about the Sandy Hook school shooting. He’s being sued by the families of the victims for defamation.
YouTube has long faced criticism for allowing misinformation, conspiracy theories and extremist views to spread on its platform, and for recommending such content to users. Research has shown that people who visit the site to watch videos on innocuous subjects, or to see mainstream news, can be steered by its recommendations toward extreme content.
In January, YouTube said it would start reducing its recommendations of “borderline content” and videos that may misinform users in “harmful ways.” That included videos featuring fake miracle cures for serious diseases, claiming the earth is flat and making blatantly false claims about historic events such as 9/11, according to the company. The company said such content doesn’t violate YouTube’s community guidelines, but that it comes close.
By Donie O’Sullivan, CNN Business