The Growing Security Threat from Alternative Platforms

Author: R.Y. Lazerson

The threat environment is increasingly dynamic and complex[1], due in part to the diversification of online platforms used by extremist actors.[2] This Deep Dive blog post is part 1 in a series that examines the adoption of alternative platforms within the broader cross-platform operations of extremists and other threat actors. Twelve one-on-one interviews with experts and stakeholders provide the basis for this analysis.[3]

Alternative platforms, sometimes called “alt-tech,” mimic many of the features of mainstream platforms but typically apply minimal content moderation and reject most collaboration with law enforcement. With so little moderation, extremist operations find a more welcoming home, using the platforms to incubate disinformation, recruit, and mobilize action. For example, BitChute, a look-alike of YouTube, is rife with racist conspiracy theory videos,[4] and Gab, a knock-off of Twitter, gained notoriety for its use by the perpetrator of the Pittsburgh Tree of Life synagogue attack.

Among alternative platforms, there is growing concern about extremist adoption of encrypted messaging applications such as Telegram, a platform that mirrors many of the features of WhatsApp. With encryption, servers store only undecipherable versions of messages. Furthermore, one-to-one chats on Telegram can use end-to-end encryption, meaning that only the sender and the recipient of a message can read it.
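To make the end-to-end guarantee concrete, the minimal sketch below uses the open-source PyNaCl library (not Telegram’s actual MTProto protocol, which differs in its details) to show why a relaying server never sees plaintext: messages are encrypted with keys that exist only on the two endpoint devices.

```python
from nacl.public import PrivateKey, Box

# Each party generates a keypair; the private half never leaves the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at noon")

# A server relaying `ciphertext` holds only undecipherable bytes.
# Only Bob, holding the matching private key, can decrypt.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet at noon"
```

The same principle explains why operators of end-to-end encrypted chats cannot moderate message content or hand it to law enforcement: there is no plaintext on their servers to inspect.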

Considering the extensive data-collection practices now common online, encryption offers an important privacy protection for the average concerned user. However, it is crucial to also examine how extremists exploit encryption as a shield. This shield can be particularly effective when an encrypted platform offers other complementary policy and design features, such as hands-off content moderation and file-sharing. A second deep dive in this blog series will use Telegram as a case study to explore how a platform’s varied characteristics can collectively afford unique opportunities for extremist organizing.

There is Significant Extremist Adoption of Alternative Platforms

Extremists have increasingly adopted alternative platforms, in part due to increased content moderation on mainstream platforms such as Facebook and Twitter. Some alternative platforms were even explicitly created to provide space for deplatformed individuals and for content prohibited on more mainstream platforms.[5]

Both qualitative signposts and quantitative studies show a migration of extremist actors and conspiracy theories to alternative platforms. For instance, QAnon, now banned from Facebook, has found a new home on Telegram.[6] Numerous overt neo-Nazi channels have also been found on Telegram, some with tens of thousands of followers.[7] Similarly, the Proud Boys, a far-right organization with terrorist designations in Canada[8] and New Zealand[9] and a ban from mainstream platforms, have established at least part of their operations on Telegram, where they have built a growing web of chapters and a home base for operations.[10] From January 2021 to January 2022, Proud Boys chapters on Telegram jumped from under 20 to over 40.[11] Figure 1[12] shows how a researcher’s simple search for “proud boys” on Telegram easily revealed a list of channels; the screenshot displays only the top four results. In a sign of how emboldened they are on Telegram, the Proud Boys even publicly shared a list of their Telegram channels.[13]
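The kind of search shown in Figure 1 is also trivial to reproduce programmatically. As an illustrative sketch, the snippet below uses the open-source Telethon library’s contacts.SearchRequest to enumerate public channels matching a query; the API_ID and API_HASH values are placeholders that a researcher would obtain from my.telegram.org, and the session name is arbitrary.

```python
from telethon.sync import TelegramClient
from telethon import functions

API_ID = 12345          # placeholder; issued at https://my.telegram.org
API_HASH = "0123abcd"   # placeholder

with TelegramClient("research_session", API_ID, API_HASH) as client:
    # Global search for public channels and groups matching the query.
    found = client(functions.contacts.SearchRequest(q="proud boys", limit=4))
    for chat in found.chats:
        # participants_count may be None if Telegram returns a partial object.
        print(chat.title, getattr(chat, "participants_count", None))
```

This ease of discovery, requiring no invitation or membership in a niche community, is part of what researchers mean when they describe far-right channels on Telegram as surprisingly easy to find.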

While the adoption of alternative platforms among extremists can in some ways be a sign of the success of counter-extremism/disinformation efforts on mainstream platforms, it also points to new opportunities for organizing among extremist groups.[14] With minimal moderation, extremist ideologies and organizations are flourishing on alternative platforms in a vast web of interconnected and expanding networks.

Alternative Platforms Are Fundamental to Cross-Platform Operations

Extremists use different platforms in creative ways to extend the reach of their disinformation campaigns on mainstream social media. In this, they mirror the techniques of professional marketing firms, which use different tech platforms to serve unique roles within a single advertising campaign.

For example, alternative platforms can play an essential role in incubating the most extreme conspiracy theories and dangerous disinformation narratives. An analysis of mis/disinformation in the 2020 election by the Center for an Informed Public (CIP) found that platforms were leveraged in distinct and complementary ways, with the intent to seed, amplify, and mobilize disinformation content.[15] For example, CIP found disinformation “megathreads” on Twitter, in which disparate tweets and retweets, containing a mix of real and false information, were consolidated into large-scale conspiracy theories. TikTok and Instagram were often used to reshare and further spread disinformation from other platforms. Meanwhile, CIP found that alternative platforms served as breeding grounds for more extreme narratives and for encouragement of political violence, which then leaked onto larger platforms to maximize spread.[16]

Each extremist group’s contextual factors (e.g., ideology, goals, location, language) drive its selection of particular platforms and how those platforms are used. For example, one interview respondent[17] shared that alternative platforms, including Telegram, may currently play a more central role for far-right extremists than for, say, Jihadi-Salafist extremists. In part, this is a result of Facebook’s and Twitter’s greater ability to identify and moderate content in English (and other “Western” languages) than in other languages. Since far-right content and actors have been deplatformed at a greater rate from mainstream platforms, Western far-right groups may have more of a need to relocate operations to alternative platforms.

Even within broader extremist ideologies, the choice and specific use of platforms can differ greatly among groups. Within far-right extremism, for example, some groups may continue to use large mainstream platforms as initial recruitment spaces, from which they invite users to smaller platforms for more intimate dialogue. Other far-right groups may operate almost entirely on alternative platforms, using larger spaces on Telegram, Discord, or Signal for initial outreach before moving recruits into smaller spaces.[18]

Some of the spread of incubated content from alternative to mainstream platforms is deliberate. For example, extremists use Telegram to rebroadcast media content “onto other messaging platforms and public-facing websites,”[19] wielding the platform as the staging ground for a cross-platform effort.

Alternative Platforms Can Serve to Expedite Radicalization

There is also growing evidence that echo chambers on alternative platforms may further radicalize those who migrated from mainstream platforms. Research at USC found that the more people are in echo chambers with a homogeneous moral vision – a shared set of guiding principles – “the more likely they are to resort to radical means and verbal violence against others, aiming to achieve their prejudicial vision.”[20] Similarly, findings at the University of Warsaw suggest that with “frequent exposure to hateful online commentaries, people become increasingly desensitized to them, and ultimately the contents of these commentaries come to shape their perception of outgroup members (minorities, immigrant groups, political adversaries).”[21]

In addition to the organic psychological radicalization that occurs in echo chambers, an upcoming deep dive in this blog series will examine how extremist groups intentionally use platform features, such as Telegram’s varied chat sizes, to move new recruits through a funnel of increasingly radicalized smaller spaces.

Foreign Actors Use Alternative Platforms to Sow Discord in the U.S.

Research also suggests that well-funded foreign actors are increasingly using alternative platforms to sow discord in the U.S. In 2020, an FBI investigation found that the Internet Research Agency (IRA), a notorious Russian troll farm, was behind a pseudo media organization, Newsroom for American and European Based Citizens (NAEBC), with accounts on both mainstream and alternative platforms.[22] NAEBC’s posts often focused on U.S. racial tensions and included negative portrayals of minority communities. Importantly, NAEBC attracted far more followers on its alternative-platform accounts, on Gab and Parler, than on its mainstream accounts on Facebook, Twitter, and LinkedIn.[23] While the mainstream platforms removed NAEBC after its identity was discovered, Gab and Parler did not.[24] Emblematic of alternative platforms’ rhetorical free-speech shield, Gab claimed that NAEBC did not violate the platform’s content policies, telling media that “[NAEBC] looks like a blog sharing news stories and opinions. It’s irrelevant to us who runs it or why.”[25] There is skepticism about how much the IRA’s various influence operations actually affected election results or public opinion. However, IRA-linked campaigns have been found to organize and/or promote numerous in-person protests in the U.S., a discernible sign of the IRA’s intention, and success, in contributing to domestic polarization.

The IRA’s use of alternative platforms may mark a strategic shift in efforts by hostile foreign entities to stoke domestic U.S. tensions by microtargeting alt-tech echo chambers for increased radicalization.[26] It stands to reason that the IRA would recognize the potential of microtargeting particular extremist channels on alternative platforms. Past research on IRA Facebook and Instagram influence-operation ads from the 2016 elections found that the ads were often microtargeted to specific demographics, with unique ad copy apparently intended to stoke U.S. racial tensions.[27] It is thus plausible that the IRA is not only exploiting alternative platforms generally, as was clear with NAEBC, but would also consider microtargeting particular extremist groups to sow discord in the U.S.

Looking Ahead: Holistic Threat Assessments

The threat environment is increasingly dynamic and complex, due in part to extremist and other threat actors’ adoption of diverse alternative platforms in their cross-platform online mobilization. With minimal to no content moderation, alternative platforms are uniquely suited to the incubation of conspiracy theories, recruitment, and extremist operations. Each threat actor’s contextual factors (e.g., ideology, goals, location, language) drive its selection of particular mainstream and alternative platforms and how it collectively uses those platforms to organize online. Counter-disinformation and counter-extremism efforts should continue to expand the aperture of analysis, examining threat actors from a holistic, cross-platform perspective.

The next deep dive in this series will introduce a matrix that can help researchers visualize extremist use of multiple platforms, and will use Telegram as a case study, providing an example of how a platform’s unique features provide distinct affordances for extremist cross-platform mobilization.


[1] “National Terrorism Advisory System Bulletin – June 7, 2022 | Homeland Security,” U.S. Department of Homeland Security, June 7, 2022, https://www.dhs.gov/ntas/advisory/national-terrorism-advisory-system-bulletin-june-7-2022.

[2] Erin Saltman, “Challenges in Combating Terrorism and Extremism Online,” Lawfare, July 11, 2021, https://www.lawfareblog.com/challenges-combating-terrorism-and-extremism-online.

[3] Interview respondents include senior academics with subject-matter expertise, think-tank specialists on extremism and disinformation, investigative researchers who focus on alternative platforms, government counter-disinformation practitioners, encryption technologists, tech employees, and other experts. Respondents have been anonymized in this blog.

[4] Milo Trujillo et al., “What Is BitChute? Characterizing the ‘Free Speech’ Alternative to YouTube” (arXiv, May 29, 2020), http://arxiv.org/abs/2004.01984.

[5] Abby Ohlheiser, “Banned from Twitter? This Site Promises You Can Say Whatever You Want,” The Washington Post, November 26, 2016, https://www.washingtonpost.com/news/the-intersect/wp/2016/11/29/banned-from-twitter-this-site-promises-you-can-say-whatever-you-want/.

[6] EJ Dickson, “QAnon Is In Crisis, But Telegram Channels Are Growing,” Rolling Stone, January 22, 2021, https://www.rollingstone.com/culture/culture-news/qanon-telegram-channels-increase-1117869/.

[7] “QAnon’s Antisemitism and What Comes Next” (Anti-Defamation League, September 17, 2021), https://www.adl.org/resources/reports/qanons-antisemitism-and-what-comes-next.

[8] Leah West, “The Complicated Consequences of Canada’s Proud Boys Terrorist Listing,” Lawfare, February 9, 2021, https://www.lawfareblog.com/complicated-consequences-canadas-proud-boys-terrorist-listing.

[9] “Designation of Two Terrorist Entities,” New Zealand Gazette, June 27, 2022, https://gazette.govt.nz/notice/id/2022-go2465.

[10] Interview respondent, interview by author.

[11] Interview respondent, interview by author.

[12] Tore Refslund Hamming [@ToreRHamming], “Finding Islamic State and Al-Qaida Telegram Channels Is Quite a Hassle. Believe Me, I’m Doing It a Lot. Yet Finding Right-Wing Extremist Channels Is Surprisingly Easy. And Note the High Number of Followers.,” Tweet, Twitter, April 21, 2022, https://twitter.com/ToreRHamming/status/1517029425627512832.

[13] Interview respondent, interview by author.

[14] Lena Frischlich, Tim Schatto-Eckrodt, and Julia Völker, “Withdrawal to the Shadows: Dark Social Media as Opportunity Structures for Extremism,” Connecting Research on Extremism in North Rhine-Westphalia, February 21, 2022, https://www.researchgate.net/publication/358742718_Withdrawal_to_the_shadows_dark_social_media_as_opportunity_structures_for_extremism.

[15] Center for an Informed Public et al., “The Long Fuse: Misinformation and the 2020 Election,” 2021, https://purl.stanford.edu/tr171zs0069.

[16] Ibid.

[17] Interview respondent, interview by author.

[18] Interview respondent, interview by author.

[19] Bennett Clifford and Helen Powell, “Encrypted Extremism: Inside the English-Speaking Islamic State Ecosystem on Telegram” (Program on Extremism – The George Washington University, June 2019), https://extremism.gwu.edu/sites/g/files/zaxdzs2191/f/EncryptedExtremism.pdf.

[20] Mohammad Atari, “Moral Echo Chambers Breed Radicalization,” Character Context Blog – SPSP (blog), February 2, 2022, https://spsp.org/news-center/character-context-blog/moral-echo-chambers-breed-radicalization. Research was conducted on alternative platform Gab and findings were replicated in a study of a Reddit subreddit called “Incels,” founded for so-called “involuntary celibates.”

[21] Wiktor Soral, Michał Bilewicz, and Mikołaj Winiewski, “Exposure to Hate Speech Increases Prejudice through Desensitization,” Aggressive Behavior 44, no. 2 (2018): 136–46, https://doi.org/10.1002/ab.21737.

[22] Zach Dorfman, “Russia Eyes Far-Right U.S. Social Media Networks,” Axios, October 7, 2020, https://www.axios.com/2020/10/07/russia-eyes-far-right-us-social-media-networks.

[23] Ibid.

[24] Jack Stubbs, “Exclusive: Russian Operation Masqueraded as Right-Wing News Site to Target U.S. Voters – Sources,” Reuters, October 1, 2020, https://www.reuters.com/article/usa-election-russia-disinformation-idUSKBN26M5OP.

[25] Ibid.

[26] Dorfman, “Russia Eyes Far-Right U.S. Social Media Networks.”

[27] Ahmed Al-Rawi and Anis Rahman, “Manufacturing Rage: The Russian Internet Research Agency’s Political Astroturfing on Social Media,” First Monday, August 16, 2020, https://doi.org/10.5210/fm.v25i9.10801.