
This article was published on December 18, 2017

Twitter’s latest white nationalist purge is predictably unpredictable [Update]

Story by Bryan Clark

Former Managing Editor, TNW

Bryan is a freelance journalist.

After announcing a new set of rules meant to scrub the site of “hateful imagery and display names,” and those who use a “username, display name, or profile bio to engage in abusive behavior” back in November, Twitter is now ready to enforce them.

It started in predictable fashion. The platform today suspended the accounts of several well-known white nationalists.

Among these suspensions was the account of the incendiary far-right group Britain First, which you may have heard of after President Trump retweeted three of its videos in November. The group is known for its pro-nationalist stance and inflammatory views on Muslims, facts be damned. Snopes describes the group as one that “has a penchant for sharing out-of-context, misleading, and false information.”

The individual accounts of its two leaders, Jayda Fransen and Paul Golding, were also suspended.

While multiple other white nationalist figureheads and publications were given the boot, the purge doesn’t seem to offer the consistency Twitter users may have hoped for.

Prominent white nationalist Richard Spencer, for example, was not suspended. Nor was former grand wizard of the KKK David Duke — although Duke reports that some of his posts are hidden behind the “sensitive material” warning. Curiously enough, his “It’s Ok To Be White” message appears uncensored in his header image but sits behind that same warning in his timeline.

We’ve reached out to Twitter for comment and will update this post should someone reply.

Update 2:06pm

A Twitter representative responded to say the company isn’t commenting on individual account suspensions, but that it would not allow groups [or individuals, presumably] that:

  • identify through their stated purpose, publications, or actions, as an extremist group
  • have engaged in, or currently engage in, violence (and/or the promotion of violence) as a means to further their causes
  • target civilians in their acts (and/or promotion) of violence

The representative stated the behavior Twitter is looking for when determining whether an account is affiliated with a violent extremist group includes:

  • stating or suggesting that an account represents or is part of a violent extremist group
  • providing or distributing services (e.g., financial, media/propaganda) in furtherance of a violent extremist group’s stated goals
  • engaging in or promoting acts for the violent extremist group
  • recruiting for the violent extremist group
