“We Need to Do Something to Fight the Online Strongholds of Radicalization!”

The extreme right-wing perpetrators of the attacks in Halle, Christchurch, and Poway used alternative online platforms to stream their attacks live. A study carried out by the Institute for Strategic Dialogue (ISD) and supported by the Robert Bosch Stiftung now shows the significance of these platforms for the spread of right-wing extremism and possible countermeasures. Author Julia Ebner explains how we can effectively restrict right-wing extremist content on the internet.

Alexandra Wolters | February 2020
Photo: Institute for Strategic Dialogue

Julia Ebner works for the London-based Institute for Strategic Dialogue (ISD) and specializes in right-wing extremism, reciprocal radicalization, and terrorism prevention. She also looks into online communications between far-right extremists. Ebner’s most recent publications include the books “Radikalisierungsmaschinen: Wie Extremisten die neuen Technologien nutzen und uns manipulieren” (Radicalization Machines: How Extremists Use New Technologies and Manipulate Us) (2019) and “Going Dark: The Secret Social Lives of Extremists” (2020).

What do you consider the most important findings of the study with regard to right-wing extremists on online platforms?

Firstly, we found that, of all the alternative platforms we investigated, the messenger service Telegram, the social network VK, the Twitter alternative Gab, and the YouTube alternative Bitchute play the largest role in mobilizing German-speaking extreme right-wing movements and activists. These overlap strongly with the preferred platforms of the English-speaking alt-right. Many of the posts we analyzed spread extreme-right ideologies and conspiracy theories in a variety of ways. Even if they don’t explicitly incite violence, their content may contribute to the radicalization of users and inspire extremist acts of violence and terrorism. That makes them dangerous.

How do you find these alternative platforms with right-wing content, which are so much smaller than Facebook and YouTube?

It’s shocking how incredibly easy it is to find these platforms. They are on the surface web, which is accessible to everyone, and not on the dark web, as you might imagine. Many far-right influencers already have a prominent presence on major platforms. From there, they guide their sympathizers to alternative platforms with extreme right-wing content or share links that take users to them. These include, on the one hand, platforms that were created for ultra-libertarian purposes and far-right content, with the aim of attracting exactly those users. But there are also platforms that were originally created for a completely different purpose, such as gaming, and have since been co-opted by right-wing extremists and exploited for their own ends. At first glance, these channels still look like gaming sites, but they spread hoaxes and extreme right-wing ideologies.

Suspension is definitely useful, but it must be justifiable

What are the effects of suspending the accounts of right-wing extremists on major platforms such as Facebook? And how useful is this method?

In the last few years, we have seen a migration to smaller platforms when accounts on larger platforms are shut down. However, the study confirmed that only a fraction of followers actually make the move. It’s much harder for right-wing extremists to generate large numbers of followers on alternative platforms. Suspending the accounts of right-wing extremist groups therefore considerably limits the distribution and reach of their content. So suspension is definitely useful, but it must be justified by reasonable grounds. We must not overlook the potential negative implications, because suspensions may lead to more frustration and conspiracy theories.

How can we tackle far-right content and attempts at radicalization on alternative platforms?

To prevent people from becoming radicalized on these alternative platforms and using them for publicity, as we saw with the attackers in Halle and Christchurch, for example, we need new intervention models with deradicalization programs, such as Counter Conversations. This ISD pilot project focuses on bringing former members of right-wing extremist groups and people who have been victims of terrorist attacks into contact with users who have posted far-right content. We have seen that there is a certain willingness to engage in such deradicalization activities online, initially anonymously.

This poses a clear threat to our democracy

We could see a wide range of motivations among users on the platforms we investigated, including general xenophobia and anti-Semitism, the desire to influence the political discourse, and simply fun and entertainment. The interventions and activities for these platforms and their users must therefore be equally varied. Even if the German Network Enforcement Act, or NetzDG, were extended to smaller platforms, it would still fall short in my view, as it is limited to removing content that is expressly illegal. But many posts containing conspiracy theories and extremist ideologies operate in a legal gray area and are therefore beyond the law’s reach.

What additional action must urgently be taken by politicians, society, and companies?

Operators of alternative platforms that have been co-opted are often unable to prevent their sites from being “taken over” by right-wing extremists. Addressing this would require improved cooperation between mainstream and alternative platforms. We recommend that operators work together so that alternative platforms can benefit from the technical capabilities of the major platforms, such as the ability to identify and remove violent posts. In terms of security policy, early warning systems must be developed to detect incitements to violence, specific threats, and planned terrorist attacks on alternative platforms in good time. This could be achieved using language-analysis algorithms, for example. It is also important to educate people about far-right extremists on the internet, the methods they use, and the radicalization risks they pose to users. This requires appropriate training programs, primarily for teachers, parents, and youth workers.

What will happen if we cannot successfully combat right-wing extremist activities on alternative platforms? 

We need to do something to fight the online strongholds of radicalization to prevent attacks like the ones in Halle, Christchurch, and El Paso from happening again. Individual platforms such as 8chan may have been taken down, but there are enough alternatives that may inspire people to commit acts of violence. There is also the danger of political influence in the long term. People may launch campaigns, for instance, which could have an effect on the online discourse and voting behavior in elections. This poses a clear threat to our democracy.

The study

The Online Ecosystem of the German Far-Right

This report presents the findings of a research project of ISD’s Digital Analysis Unit about the alternative...
