Opinion

From Satirical Memes to Massacring Muslims: How the Dark Web Turns White Supremacists Into Terrorists

The far right has weaponized the alternative Internet culture that once reveled in transgressive memes and witty pop culture references. Their gamification of extremism, largely unnoticed and unchallenged, has lethal consequences

People mourn at a makeshift memorial site near the Al Noor mosque in Christchurch, New Zealand, March 19, 2019.
Vincent Thian/AP

From incitement to violence camouflaged as LARPing (Live Action Role Playing) to livestreamed shootings, we have entered a new age of gamified terrorism.

The 28-year-old Christchurch attacker Brenton Tarrant was an active member of the far-right online community on 8chan, an imageboard forum with a culture of anonymity, trolling and ‘sarcastic’ racism that has become popular with white nationalists, neo-Nazis and alt-right sympathizers. 


Those of us who knew the early imageboards of 4chan from our teenage years in the 2000s, and who laughed at their transgressive memes and witty pop culture references, find these corners of the internet hardly recognizable today.

They have undergone a profound metamorphosis over the past few years: trolling has been politicized, and shitposting weaponized. Breaking taboos is no longer an end in itself. It has become a means to spread harmful ideas.

Unregulated social media platforms such as 8chan, Voat, Gab and Minds that present themselves as "free speech" alternatives to mainstream platforms serve today's far right as online safe havens. We call these places on the internet "Dark Social" because of their obscure nature. Some of these virtual communities are so hard to monitor and track that the gamification of extremism happened largely unnoticed and unchallenged.

A frame from the video livestreamed by Brenton Tarrant showing him reaching for a gun in the back of his car before shooting dead 50 Muslims at prayer in mosques in Christchurch, New Zealand. March 15, 2019
AP

With virtually no content moderation, "Dark Social" platforms are rife with hateful content and violent far-right imagery that escapes policy scrutiny. The 8chan thread on which Tarrant posted about the attack contained everything from pictures of Hitler and swastikas to popular far-right memes like Pepe the Frog and pop culture references such as pictures of the Joker.

The Christchurch attack demonstrated how far-right extremists have weaponized internet culture and infrastructure to air their violent ideology and rally support from a virtual community. It confirmed that governments, security services and tech companies are unprepared to face the challenges of far-right mobilization on unregulated platforms like 8chan.

At the Institute for Strategic Dialogue we have been monitoring the meeting hubs of these online subcultures for the past few years. We have watched individuals tumble down the rabbit holes of radicalization in real time. Some arrived "from normal backgrounds," as Brenton Tarrant says of himself in his manifesto, only to find themselves drawn into self-reinforcing extremist bubbles.

To us, the Christchurch attack was a shock but not a surprise. The question was not if but when and where these extremist hotbeds would inspire a real-world attack. 

Before Tarrant, other far-right extremists radicalized and shared sinister plans on alt-tech platforms. 

The attacker who rammed a van into a crowd in Toronto last April was part of the incel subculture, the online community of "involuntarily celibate" men brought together by their hatred of women. The Pittsburgh synagogue attacker Robert Bowers was a familiar presence on Gab, where he posted about his plans to carry out an attack.

This image shows a portion of an archived webpage from the social media website Gab, with a Saturday, Oct. 27, 2018 posting by Pittsburgh synagogue shooting suspect Robert Bowers.
AP

Taking down a platform like 8chan may seem like an easy fix after the Christchurch shooting. But the reality is that the problem is not confined to niche internet sub-cultures anymore. 

While hateful enclaves on alt-tech platforms continue to escape policy scrutiny, far-right extremists also rely on mainstream social media platforms to spread their narratives, skillfully navigating the gray zones of legal frameworks and company policies.

Tarrant showed that livestreams and other visual content are easily exploitable loopholes in the moderation systems of mainstream platforms.

Tarrant posted a picture of his weapons on Twitter a few days prior to the attack. Facebook took down the attacker's livestream video and deleted his Facebook and Instagram accounts, but archived copies continued to spread across other platforms such as YouTube and Twitter. "I’m not sure how this video was able to stream for [17] minutes," a Facebook source said. Within the first 24 hours, Facebook removed 1.5 million videos of the attack.

When videos can turn violent in a split second, and with few mechanisms to stop someone from starting a livestream video, armies of content moderators are struggling to keep up. Meanwhile, far-right extremists have understood the potential of live streaming in generating sensationalism and making their actions go viral.

Ultimately, the attack exposed multiple overlapping problems. The bad news is that there is no quick fix for any of them. Solutions that focus only on individual platforms are doomed to fail.

Front pages of Australia's major newspapers in Melbourne on March 16, 2019, reporting on the shooting attacks at two mosques in Christchurch, New Zealand
AFP

We need wide-ranging responses that look at the entire media and tech ecosystem, from alt-tech platforms to social media giants. A broad, longer-term government response is needed to stem the growth of far-right ideology online and offline.

There is also good news. Governments in Europe have started to take steps to combat far-right extremism online. Germany adopted the NetzDG or "network enforcement" law which imposes fines on social media platforms if they do not remove hateful content. France is set to adopt a similar law in May this year. These changes, while welcome, remain largely reactive and do not apply to alt-tech platforms.

Identifying the most radicalized individuals on alt-tech platforms and denying them access could be a first step. Governments and tech companies need to work together to devise long-term solutions to fight far-right radicalization online, while strengthening civil society-led efforts to challenge the narratives that inspired the hateful attack in Christchurch.

Julia Ebner is a terrorism and extremism researcher, and author of The Rage: The Vicious Circle of Islamist and Far-Right Extremism (2017). She is a Research Fellow at the Institute for Strategic Dialogue and a Global Fellow at the Project for the Study of the 21st Century. Twitter: @julie_ebner

Cécile Guerin is a freelance French journalist based in London. She is an Associate at the Institute for Strategic Dialogue. Twitter: @CecileNGuerin