
This article was published on October 6, 2020

Facebook and Instagram just banned all QAnon accounts, “even if they contain no violent content”

Back in August, Facebook announced it would begin banning posts related to the QAnon conspiracy movement that supported acts of violence. Since then, the company has removed “over 1,500 Pages and Groups for QAnon containing discussions of potential violence and over 6,500 Pages and Groups tied to more than 300 Militarized Social Movements.”

But today, the company is making one of its broadest moderation moves ever, banning “any Facebook Pages, Groups, and Instagram accounts representing QAnon, even if they contain no violent content.” The company says the decision comes because calls for violence are just one form of “real world harm, including recent claims that the west coast wildfires were started by certain groups.” The company also cites the fact that “QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another.”

To be clear, the company is not banning individual posts, but rather accounts dedicated to spreading the conspiracy theory, a Facebook spokesperson clarified to NBC News.

It can be hard to follow what QAnon ‘leaders’ are saying in any given week, but the pro-Trump group’s conspiracies have largely revolved around the claim that a “cabal of Satan-worshiping pedophiles running a global child sex-trafficking ring is plotting against President Donald Trump,” to quote the Wikipedia page. As the New York Times notes, some adherents believe the cabal of leftist politicians and celebrities eats the children too. Even when the claims are less outlandish, QAnon groups can be particularly insidious in shaping the opinions of Facebook users through the use of memes.

Facebook notes that implementing the updated policy and removing accounts “will take time and need to continue in the coming days and weeks.” The company’s ‘Dangerous Organizations Operations’ team will work to proactively detect content rather than waiting for user reports; Facebook says these specialists can identify QAnon content more accurately than sifting through user reports would allow.

Facebook says it expects QAnon supporters to make further attempts to avoid detection, so it will continue to study the impact of its changes and implement further updates as necessary.

Via NBC News
