Tackling Online Radicalization At Its Offline Roots

We can't successfully combat radical right extremism by targeting its propaganda alone. We have to tackle the societal roots of the problem.
Left: The August 2017 Charlottesville “Unite the Right” rally (Photo Credit: Anthony Crider/Creative Commons). Right: Jayda Fransen and Paul Golding, leaders of the far-right group Britain First, at a rally in London on November 4, 2017. (AP)

How does the radical right recruit young people online? What role do social media platforms play? And how do we respond to online hate? CARR Policy & Practitioner Fellow, and Regional Coordinator for the UK’s Countering Violent Extremism Programme (Prevent), William Baldet examines the potential rabbit holes awaiting young people online and the need to address the root causes of radicalization in order to mitigate its harmful effects.

You would be forgiven for thinking that the advent of Covid-19 and the accompanying lockdowns have led to the mass radicalization of our youth, consigned to cyberspace and exposed to vast swathes of online terrorist propaganda. It’s true that cases of ‘self-radicalization’ in the virtual world do occur, but we must move away from the narrative that passive exposure to extremist content is creating an army of Gen-Z terrorists (lest it become a self-fulfilling prophecy).

Instead, we need to understand the wider political and cultural ecosystem in which extremists see themselves, and how offline contexts and assertive online strategies combine to draw young people out of more innocuous environments and down the radical right rabbit hole.

In the online space, playing ‘whack-a-mole’ with terrorist content, while positive and necessary, can only address the tip of an increasingly amorphous iceberg. It’s a strategy largely confined to mainstream platforms – whose willingness to oblige can be inconsistent – and so misses the content sitting untouched in marginal corners of the internet, on dark-web and fringe social media platforms.

Algorithmic Radicalization? A Journey Down the Online Radical Right Rabbit Hole

Most extreme right-wing propaganda is designed to saturate younger consumers in alternative sub-cultures of disturbing humor, racist memes, and pervasive ‘dog whistles’ to white supremacist tropes, most of which would not attract the attention of the algorithms employed to identify terrorist content. In fact, the central algorithmic premise of mainstream platforms is to serve up more of the same to their users, thereby propagating problematic content and increasing exposure to it.
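To make that feedback loop concrete, here is a minimal, purely hypothetical sketch of ‘more of the same’ recommendation: a toy engine that ranks unseen items by tag overlap with a user’s watch history. The catalogue, the tags, and the similarity measure are all illustrative assumptions, not any platform’s actual system; the point is simply how each click narrows the next round of suggestions.

```python
# A toy "more of the same" recommender: rank unseen items by tag overlap
# with the user's watch history. Hypothetical throughout -- not any real
# platform's algorithm, just the feedback loop in miniature.

def jaccard(a: set, b: set) -> float:
    """Tag-set similarity: 0 = disjoint, 1 = identical."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def recommend(history, catalogue, k=1):
    """Rank items the user hasn't seen by average similarity to history."""
    seen = {item for item, _ in history}
    def score(tags):
        return sum(jaccard(tags, h) for _, h in history) / len(history)
    unseen = [(item, tags) for item, tags in catalogue.items() if item not in seen]
    return sorted(unseen, key=lambda it: score(it[1]), reverse=True)[:k]

# Illustrative catalogue: "bridging" tags link benign topics to fringe ones.
catalogue = {
    "gym-tips":      {"self-improvement", "fitness"},
    "dating-advice": {"self-improvement", "dating"},
    "edgy-memes":    {"dating", "humour", "memes"},
    "fringe-rant":   {"memes", "politics", "conspiracy"},
}

history = [("gym-tips", catalogue["gym-tips"])]
for _ in range(3):
    item, tags = recommend(history, catalogue)[0]
    history.append((item, tags))          # the user "watches" the top pick

print([item for item, _ in history])
# ['gym-tips', 'dating-advice', 'edgy-memes', 'fringe-rant']
```

Even this crude similarity loop drifts from fitness content towards fringe material once a few bridging tags connect the two – the structural tendency described above, independent of any one company’s design.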

A fledgling journey through the looking glass starts with influencers (such as Mark Collett, Black Pigeon Speaks, or Brittany Sellner) on mainstream platforms discussing topical issues of great import to their young audiences: self-improvement, dating, gaming, and pop culture. Platforms such as YouTube, TikTok, Facebook, Instagram, Twitter, and Snapchat all become unwitting participants as these commentators attract huge fanbases, ranging from hundreds of thousands of followers to in excess of a million.

Some influencers, fully aware of the eagle-eyed content moderators with their fingers poised over the ‘delete’ button, are careful to curate seemingly benign and charismatic videos to entice their audiences. Woven into their content are sly nods and dog whistles to the politics of white nationalism and white supremacy, intended to bait their acolytes into interpreting the subtexts and pollinating the comments sections – mostly unmoderated – with explicitly extreme content.

It is here that signposts to the deeper recesses of their twisted ideologies lurk. For example, a video that makes oblique references to the possibility of a sinister Jewish elite will trigger anti-Semitic hyperbole in the comments beneath it, where loyal agitators post hashtags and hyperlinks to marginal platforms hosting more extreme content. Some are unencumbered by moderators, while others offer the protection of encryption. Some, of course, provide both. While writing this article, for example, I found myself lurching from far-right anti-Semitic conspiracies on YouTube to a Telegram channel hosting the Halle shooter’s horrific livestream.

Message boards and imageboards are especially popular; there, memes trivializing violent racism and neo-Nazism proliferate. Through a barrage of ugly satire and conspiracy theories, anxieties about immigration become conspiracies of The Great Replacement, while historical revisionism about World War Two morphs into Holocaust denial. The use of humor makes the content appealing and infinitely shareable, and so the consumers themselves become the purveyors, ‘swarming’ the propaganda to new networks and attracting fresh recruits.

From here, the rabbit hole deepens. These online forums provide in-group socialization, nectar to the isolated young people who frequent them, their inability to socialize in the physical world no longer a barrier in cyberspace. Links to pro-Nazi bloggers and terrorist manifestos are routinely shared as young people are introduced to an alternative world of enlightened, ‘red-pilled’ peers who bemoan the complacent, older generation that has failed young people and allowed a liberal, multicultural society to develop.

Their contempt for feminism as an emasculating conspiracy against men is reflected in the misogynistic language and jokes, with rape and violence against women a recurring theme. Apps (such as Telegram and Discord) provide secure chat rooms for extremist and terrorist content to be shared and posted, away from the monitoring and detection of content moderation.

The Solution? Inoculating Against Hate

There is no easy solution to this, and old, offline techniques of disruption won’t work. The goalposts haven’t just moved; they are on a different pitch altogether. The street movements of old are largely becoming obsolete (or at least secondary) sites of mobilization, and the stereotype of the thuggish, far-right football hooligan has been replaced by the internet-savvy loner using bandwidth, not protest, to propagate hate and connect with like-minded individuals. The swarm nature of memetic warfare makes identifying and arresting the authors a Sisyphean task; even if it were possible, the content very often falls below the criminal threshold, making prosecution problematic.

Since much of the content is not illegal, we must find ways to disrupt its influence. In 2014, as the Leicester Prevent Strategy Coordinator, I commissioned a series of school lesson plans that taught young people the mechanics of conspiracy theories and the tactics of online coercion and recruitment. The goal was to create a psychological ‘immunity’ to online conspiracies and misinformation.

More recently, researchers have suggested prebunking (or ‘attitudinal inoculation’) as a way to tackle misinformation. They argue that, just as administering a weakened dose of a virus triggers antibodies in the immune system to fight off future infection, pre-emptively exposing people to weakened examples of the common techniques used in the production of fake news would generate ‘mental antibodies’.

Other academics have focused on the ‘psychology of misinformation’ and suggest that the skills to resist fake news and conspiracies are inherent within us; they just need amplifying. Skepticism, alertness, analytical thinking, friction, inoculation, and nudges (prompting people to consider accuracy before sharing content) should therefore be promoted as ways of steering young people away from the rabbit hole.
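As a concrete illustration of ‘friction’ and accuracy nudges, the sketch below interposes an accuracy prompt into a hypothetical share flow before a repost goes through. It is a toy rendering of the design pattern the research describes, not any platform’s real pipeline; every name in it is invented for illustration.

```python
# Hypothetical sketch of an "accuracy nudge": before a share completes,
# the flow pauses and asks the user to judge the headline's accuracy.
# Illustrative only -- a pattern from the misinformation literature,
# not any platform's actual share pipeline.

from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    url: str

def accuracy_nudge(post: Post, ask) -> bool:
    """Pause the share flow with an accuracy prompt, then re-confirm.

    The rating itself gates nothing: in the nudge literature, the pause
    and the act of judging accuracy are what shift behavior.
    """
    _rating = ask(f"How accurate is this headline (1-5)? {post.headline!r} ")
    confirm = ask("Still want to share it? (y/n): ")
    return confirm.strip().lower() == "y"

def share(post: Post, ask=input) -> None:
    """Share only if the user re-confirms after the nudge."""
    if accuracy_nudge(post, ask):
        print(f"Shared: {post.url}")
    else:
        print("Share cancelled.")

# Simulate a user who, prompted to consider accuracy, decides not to share.
scripted = iter(["2", "n"])
share(Post("Shocking claim about group X", "https://example.com/post"),
      ask=lambda prompt: next(scripted))
```

The design choice worth noting is that the friction is deliberately cheap: the prompt does not block or fact-check anything, it simply inserts a moment of reflection between impulse and repost.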

These attempts to tackle the mechanics of the problem are positive and worthwhile, but they treat the symptoms, not the cause. As a practitioner in countering violent extremism, I have learned to listen to those we have drawn back from the brink of radicalization. Offline socio-psychological factors (e.g. deprivation, low self-esteem, and mental health issues) coalesce with online peer networks to foment a worldview, providing ethical frameworks and perceptions of reality that are then reflected back to them in the cultural, moral, and political environment they find themselves in.

Conclusion

Rather than simplifying online radicalization as mere exposure to terrorist propaganda, we must recognize it as a symbiosis between the relentless normalization and dissemination of extremist ideologies and the real-world anxieties of a disaffected, socially isolated cohort of online consumers who find purpose and belonging in the identity politics of the radical right. They are not passive consumers but active participants who feel displaced and disenfranchised by today’s political and cultural polarization.

We can (and should) increase pressure on social media companies to moderate their content, educate young people to navigate online hazards, and employ Countering Violent Extremism (CVE) programmes to pick up the pieces when radicalization occurs. But online harms have offline roots, and we need to look beyond the mechanics of the problem to a far more fundamental solution that targets the causes of this extremism in the first place. We need to reconcile the tensions between a Left resentful of its losses at the ballot box and a Right embittered by its diminished cultural influence; if Left and Right can’t police their own fringes and embrace a common ground, and if society continues on its current polarized trajectory, then the bewilderment felt by both sides will escalate and drive people towards ever more toxic and nihilistic extremes.

This article is brought to you by the Centre for Analysis of the Radical Right (CARR). Through their research, CARR intends to lead discussions on the development of radical right extremism around the world.
