I Spent A Month In Radical Right Social Networks. Here’s What I Learned.

Rantt Writer Luke Huizenga took a deep dive into social networks that cultivate extremism and became consumed in a whirlpool of disinformation and hate.
Proud Boys in Raleigh, North Carolina – November 28, 2020. (Anthony Crider, CC BY 2.0, via Wikimedia Commons)

This is the follow-up to an investigative piece I wrote in February about disinformation and hate speech on alt-tech social media platforms. The original story can be found here. Due to the nature of certain content in this article, some sources have been left anonymous to avoid spreading further hate or exposing individual identities.

After my first story on alt-tech was published, several readers told me they’d like to see my research in a more narrative format. So, I’ve decided to recount my experience learning about each platform, which spanned about a month. This time, I’ll provide more details and focus on how I came to uncover the darker side of right-wing social media.

My research began on Gab, the conservative Twitter/Facebook hybrid. Upon account creation, Gab suggests a variety of groups to join. These are some of the biggest gathering places for fans of different topics. From recipes to music to games, all sorts of groups make this list. However, when I made my own account, one suggestion stood out to me: /g/The_Donald. It was the biggest group on the list, boasting over 300,000 members.

Before going any further, this group’s name requires some explanation. The website Reddit is a message board-style collection of online communities known as “subreddits”. In 2015, it gained a subreddit called r/The_Donald, a place for Trump supporters to gather online. Reddit eventually banned the subreddit in June 2020 for consistent policy violations, including a steady stream of racism, misogyny, anti-Semitism, and other offensive content. When I saw the group titled “/g/The_Donald”, I put two and two together.

/g/The_Donald is, unsurprisingly, similar to r/The_Donald. Typical posts include COVID-19 and vaccine skepticism, Islamophobic and xenophobic rants, and numerous conspiracy theories. Beyond its content, though, Gab directly promotes /g/The_Donald, suggesting it the moment someone signs up. That is clearly no accident.

Graphic of Hitler found in /g/The_Donald

After I finished setting up my account, one of the first posts in my feed was a video from Gab founder and CEO Andrew Torba welcoming new users to the site. While I was watching, though, something in the Recommended Episodes section caught my eye: a video called “Rise of the JEWnicorn”.

The video, it turned out, showed a man in a bunny costume waving around a poster denying the Holocaust, along with a link to an alt-right website rife with anti-Semitic content. I don’t believe Torba intended for this video to be paired with his own. It bears noting, however, that Holocaust denial was attached to one of the most-viewed videos on the platform, not to some random account with three followers.

At this point, most of my work was simply a matter of making connections. By building a growing list of problematic channels, I was able to find reams of hateful videos on Gab TV. Instead of showing some of the more disturbing content, I’ll include one example that represents the general anti-Semitic attitude found on the platform: a video titled “Ding-Dong the Jew is Dead”. In case it’s unclear, the original image of Ruth Bader Ginsburg, who was Jewish, has been replaced with one of Anne Frank.

Moving on from Gab, the next platform I looked into was MeWe. Of the site’s problematic content, disinformation was easily the most prevalent. My first move was to gauge MeWe’s level of moderation, and I immediately noticed the lack of action against anti-vaccine posts. The number of groups and pages condemning vaccines far eclipsed those promoting them.

Thousands of users were flocking to these communities. I understood this better after reading a statement by CEO Mark Weinstein, who claimed that social media platforms had no responsibility to filter content that didn’t violate MeWe’s Terms of Service. He specifically referenced conversations about health and medicine in this context.

Several of these groups also regularly post other conspiracy theories, everything from the “stolen” election to the dangers of 5G. Bill Gates and Speaker Nancy Pelosi (D-CA) make frequent appearances. This, too, is accepted and left up. By permitting these groups to stay online, MeWe allows for an even greater spread of disinformation.

Ironically, these are some of the same problems Facebook is dealing with, even though MeWe has long billed itself as the “anti-Facebook”. In a sentiment similar to Weinstein’s, Mark Zuckerberg has said, “There is a fine line between an important level of high energy around an important issue and something that can kind of tilt over into causing harm.”

After tracking MeWe conspiracy theories, I made an account on Telegram, a private messaging app that’s incredibly popular among extremists and the radical right. I joined a wide variety of group “channels” devoted to anti-Semitic, racist, neo-Nazi, and conspiracy topics. They reveled in violence and bigotry, sharing offensive memes and describing the things they’d like to do to Jews, African-Americans, and Asians, among others. As on Gab, this content was easy to find. The only “research” I had to do to access these groups was to search a directory for certain keywords. What took more time was compiling the evidence and finding the connections that would lead to the particularly explicit channels.

One of the main reasons Telegram is so effective at harboring extremism is the way it spreads information. Similar to Facebook’s “Share” or Twitter’s “Retweet”, users can forward a specific message from one channel to others. If people like what they see, they can follow the forwarded link back to the original channel. So when one channel shares a racist meme and some of its members forward it elsewhere, that channel’s popularity grows a little more. And because Telegram has a much smaller user base than Facebook or Twitter, communities are likelier to interact with each other, spreading content faster than on typical social media platforms.

Once I understood this feature, I steadily accumulated a list of hateful and dangerous channels, all awash in conspiracies and fake news. What was so striking was the sheer lack of moderation. Granted, every so often I came across notices that a message had been removed for violating Telegram’s Terms of Service.

However, upon reading those Terms of Service, I realized rigorous moderation was virtually nonexistent. Beyond a specific privacy policy, Telegram has only three rules: do not scam users, do not promote violence, and do not share illegal pornography. The latter two rules don’t actually apply to private channels, but private channels weren’t what I was interested in. The promotion of violence was publicly accessible.

Something I noticed over time was that several channels often linked their content to Gab. One example was a neo-Nazi group from Australia, which, it turned out, had its own group account on Gab. Granted, Gab has a significant Nazi presence to begin with, so this came as no surprise; I had found a number of Nazi profiles while researching the platform. They were not as pervasive as on Telegram, but unchecked Nazi content was there nonetheless.

The last social media network I looked into was BitChute, the radical right’s version of YouTube. What makes BitChute so dangerous is the way it smoothly draws users toward its more extreme content. At first glance, BitChute portrays itself as a conservative platform simply giving users their First Amendment rights. But that is a surface-level reading. What BitChute excels at is seeding its more extreme content sparsely throughout the mass of more popular material.

If someone were to visit the homepage, they would likely find a wide variety of conspiracy theories, as well as monologues about gun control, abortion, and socialism. Every so often, though, another kind of video surfaces on the “Popular” or “Trending” lists: QAnon “evidence”, anti-Semitic stories, misogynistic rants. Once a person starts watching these videos, more suggestions crop up promoting similar topics. Eventually, the echo chamber grows so strong that it is easy to forget BitChute presents itself as a simple alternative to YouTube.

Popular QAnon channel on BitChute’s homepage.

This channel regularly posts videos sexually objectifying women. It has over twenty thousand subscribers.

As with Gab, I found myriad links to BitChute in Telegram channels, typically tied to racist or Nazi content. One video in particular stuck out to me; the channel I found it in was the same Australian neo-Nazi group mentioned earlier. The video, “starting your own crew”, featured infamous neo-Nazi Rob Rundo giving tips to aspiring Nazis on how to create and grow their very own supremacist groups. In it, he shared stories of founding his first club, the Rise Above Movement. I was struck by how casual he sounded, something that was clearly intentional. He made it plain that one of his main goals is to normalize white supremacy, and that the more groups, big or small, spread across the U.S., the sooner that will happen.

I realized that BitChute was no better than Telegram at removing this kind of content. It fell under the umbrella of “free speech”, so the platform claimed no obligation to take it down. BitChute was yet another harbor for hate and disinformation.

By now, this was a familiar experience, and I became even more convinced that alt-tech social media, whatever good intentions may have been behind it, was enabling incredibly harmful content to flourish without consequence. If any of these sites genuinely want to keep users safe and well-informed, I have been unable to find evidence of it.

In all honesty, my dive into the dark corners of internet extremism left me deeply jaded about the social and political divides in our world. I’ll never forget some of the images and phrases I encountered. Before beginning my research, I knew I was in for something powerful and disturbing, but I didn’t consider how much it would affect me. I will never look at social media the same way. My experience has left me with doubts about unchecked free speech online; as a journalist, I find such questions troubling, to say the least. However, I am certain of one thing: online hate still prospers. This is not the end of my investigation.
