Alphabet Inc's YouTube said on Thursday it was banning content that targets an individual or a group using conspiracy theories such as QAnon or Pizzagate.
The move comes one week after Facebook and Instagram classified the QAnon conspiracy theory movement as dangerous and began removing Facebook groups and pages, as well as Instagram accounts, that present themselves as representatives of the movement.
The step escalates an August policy that banned about a third of QAnon groups for promoting violence while allowing most to remain, albeit with their content appearing less often in news feeds. Instead of relying on user reports, Facebook staff will now treat QAnon like other militarized social movements, proactively seeking out and deleting groups and pages, the company said in a blog post.
Since the August restrictions, some QAnon groups have added members, and others have used coded language to evade detection, for example referring to "cue" instead of Q. Meanwhile, adherents have worked to integrate themselves into other groups, such as those concerned with child safety and those critical of coronavirus-related restrictions on gatherings, according to researchers at Facebook and elsewhere.
"While we've removed QAnon content that celebrates and supports violence, we've seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups," Facebook wrote.
"QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another."
Recent QAnon posts have spread false information about voting and about COVID-19, researchers said, including claims that President Donald Trump faked his COVID-19 diagnosis in order to orchestrate secret arrests.