

Marina Amador, Lydia Pardun, Extremism Team

Week of Monday, October 4, 2021


Extremists and terrorists are increasingly utilizing video games, gaming community platforms, and under-regulated social media outlets such as TikTok to recruit new members and disseminate propaganda.[2] The dynamics of these platforms allow communities to connect with large audiences and easily share extremist narratives with other users. Extremist and terrorist actors utilize tailored video games, memes, inside jokes, and popular trends to radicalize individuals online.[3] Radicalization strategies have proven most effective among youth, who comprise the majority of video game and social media users.[4] Malicious individuals and organizations will very likely continue to exploit these platforms, as their heavy reliance on user-led moderation and lack of reporting mechanisms offer an opportune space to interact with potential recruits and share extremist content. However, more effective content moderation mechanisms would likely reduce user exposure to extremist ideology and the potential for radicalization.

Social media outlets such as TikTok are likely to be exploited by extremists to further engagement with their ideology. TikTok's structure for community involvement allows users to build upon and expand content quickly and extensively. Extremists have taken advantage of TikTok's platform mechanics to spread white supremacy and neo-Nazism.[5] TikTok's algorithms almost certainly create echo chambers that likely expose users repetitively to extremist content, likely forming extremist social groups that reaffirm and promote members' beliefs. Users very likely feel social pressure to belong to such groups, and will likely share content without critically evaluating it in order to feel accepted, further spreading extremist ideology.

Multiplayer video game communities allow users to create modifications (mods) to game mechanics that can then be shared with other users. Extremists have been using mods to design and spread roleplay simulations, such as concentration camps in Minecraft and a racing game on Roblox in which the goal is to run over racial minorities.[6] Mods allow players to project themselves into narratives created by extremists, likely leading to short-term subconscious adoption of radicalized morals that are very likely to become embedded and conscious the more an individual plays these simulations. More complex and realistic game mods are likely to be employed by lone actors and extremist groups to plan and prepare for a terror attack. The sophistication of mods also provides extremists with a medium to prepare for the mental aspects of an attack, likely resulting in more effective attacks and greater casualties.

As extremists continue to use video games and online platforms to disseminate their propaganda and recruit new members, the number of young people radicalized online is likely to continue to increase.[7] COVID-19 lockdowns have almost certainly reinforced this trend, as the lack of face-to-face social interaction, disruption of daily routines, and extensive isolation are likely to have made youth more vulnerable to engagement with extremist content. Extremists have used the sharing of memes and humorous content related to popular topics as a tactic to interact with young people online.[8] The humorous component of these interactions has likely resulted in more expansive youth engagement with extremist narratives, as it provides entertainment during periods of social isolation. Humorous content further provides an effective tool for radicalization, as extremist intentions can be veiled as friendly interaction. Youths' more extensive engagement with extremist content during COVID-19 lockdowns has almost certainly enabled extremists to improve online outreach and recruitment strategies. This will almost certainly result in extremists continuing to connect with vulnerable individuals by exploiting the common ground and sense of community offered by social media and video game platforms.[9] This strategy is likely to increase recruitment and radicalization, as young people are more likely to engage in activities related to those they associate with leisure.

Growing concern over the use of online platforms to spread violent and extremist propaganda has led some sites to increase their efforts to counter these narratives.[10] For instance, TikTok removed more than 300,000 videos related to violent extremism between January and March 2021.[11] However, heavy reliance on user-led moderation is very likely to continue fostering exploitation by extremists. Radicalized individuals are likely to use specific terms and images that average users might not be familiar with in order to bypass security measures, almost certainly resulting in a large portion of violent and extremist content going undetected. Platform efforts to ban and remove such content from servers will very likely be hindered by claims that such efforts violate freedom of expression. This will almost certainly encourage extremists to continue to use this technique, facilitating easier dissemination of propaganda. The majority of video games, gaming community platforms, and social networking sites also lack effective reporting mechanisms, allowing extremist rhetoric to remain online for longer periods of time and increasing the likelihood of users being exposed to such content. Extremist individuals and organizations will almost certainly continue to use online communities for recruitment purposes as long as they can exploit vulnerabilities in security systems. Implementing rigorous identification procedures for accessing these platforms would likely result in a reduction of radicalization trends. Increasing the number of official content moderators and training them to detect different types of hateful content would very likely be another successful strategy, as it would enable faster and more effective removal of extremist and terrorist content.

The Counterterrorism Group (CTG) is a subdivision of the global consulting firm Paladin 7. CTG has developed a business acumen that proactively identifies and counteracts the threat of terrorism through intelligence and investigative products. Business development resources can now be accessed via the Counter Threat Center (CTC), emerging Fall 2021. The CTG produces W.A.T.C.H. resources using daily threat intelligence, also designed to complement CTG specialty reports, which utilize analytical and scenario-based planning. Innovation must accommodate political, financial, and cyber threats to maintain a level of business continuity, regardless of unplanned incidents that may take critical systems offline. To find out more about our products and services visit us at


[1] “TikTok” by TheBetterDay licensed under Creative Commons

[2] “Digital Challenges”, Radicalisation Awareness Network, 2021,

[3] “Extremists’ Use of Video Gaming – Strategies and Narratives”, Radicalisation Awareness Network, 2020,

[5] “Extremist content is flourishing on TikTok: Report”, Politico, August 2021,

[6] “Extremists using video-game chats to spread hate”, BBC News, September 2021,

[7] “Teen terrorism inspired by social media is on the rise. Here’s what we need to do”, NBC News THINK, March 2021,

[8] “Extremists’ use of gaming (adjacent) platforms - Insights regarding primary and secondary prevention measures”, European Commission, 2021,

[9] “Is It Just a Game? Exploring the Intersection Between (Violent) Extremism and Online Video-Gaming”, VOX-Pol, September 2021,

[10] “Facebook tests extremist content warning messages”, BBC News, July 2021,

[11] “TikTok said it removed more than 300,000 videos in the first three months of 2021 for spreading ‘violent extremism’”, Insider, June 2021,


