
Security Brief: Extremism, Week of October 4, 2021

Week of Monday, October 4, 2021 | Issue 54

Beatrice Williamson, Marina Amador, Extremism Team


Locked out of Facebook[1]


Date: October 4, 2021

Location: Global

Parties involved: Telegram; Facebook; Messenger; WhatsApp; Instagram; Signal

The event: On Monday, Facebook suffered a blackout that also affected WhatsApp and Instagram. The outage lasted approximately six hours and affected users across the globe.[2] The blackout was reportedly caused by an error during routine maintenance of Facebook’s data center network, which made the company’s Domain Name System (DNS) records unavailable, locking both Facebook and its users out.[3] During the outage, Telegram’s founder, Pavel Durov, claimed the encrypted instant messaging app registered 70 million new users, massively surpassing the platform’s normal daily growth rate.[4] The messaging app Signal also reported an influx of millions of new users.[5]
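To illustrate the failure mode described above, the short sketch below shows how a client-side DNS lookup fails when a domain's records cannot be resolved. The `can_resolve` helper and the reserved `.invalid` test name are illustrative assumptions, not part of the cited reporting; during the outage, lookups for Facebook's domains failed to users in essentially this way.

```python
import socket

def can_resolve(hostname: str) -> bool:
    """Return True if the hostname resolves via DNS, False otherwise."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        # Raised when no DNS record can be found or reached,
        # which is how the outage appeared to client software.
        return False

# RFC 2606 reserves the .invalid TLD so that it never resolves,
# making it a safe stand-in for an unreachable domain:
print(can_resolve("example.invalid"))  # False
```

Because applications receive only a resolution error rather than any explanation, users experienced the outage simply as the services ceasing to exist, which helps explain the rapid, wholesale migration to alternative apps.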

The implications:

  • The use of Telegram by extremists is very likely to increase, as the blackout has likely caused long-lasting reputational damage to Facebook. A large number of users are almost certainly embedded in Facebook’s services, relying on the company for news, communications, and social networks; these users are very likely to continue using the platform. However, the outage is likely to draw political and regulatory attention to the risks of Facebook’s monopoly-like hold over the social media sector and the impact this has on global society. The outage has likely provided critics of Facebook with both evidence of the company’s risks and the impetus to act. It is almost certain that Facebook will face increased scrutiny in both government oversight and public debate, increasing the likelihood that extremists will move to platforms with greater privacy, such as Telegram and Signal.

  • Previously, a significant drawback of Telegram for extremists was its smaller audience; however, the blackout increased the overall potential audience size, which is likely to continue to grow as users explore alternative platforms. It is very likely most users did not make the shift earlier because of WhatsApp’s convenience and significantly larger user pool, whereas Telegram has typically been a more niche app with fewer users. The 70 million users who joined Telegram during the outage indicate that this was likely the push many needed to overcome the initial motivation barrier. If Facebook continues to experience public scrutiny, technical difficulties, and regulation, the migration from WhatsApp to Telegram will likely continue, as the app becomes incrementally more appealing with each additional contact who joins. Once on Telegram, users are very likely to recognize the benefits of its encryption services, and in light of increasing content moderation on mainstream platforms, extremists are very likely to continue the migration to Telegram.

  • The shift of extremist users from Facebook to Telegram is highly concerning, as Telegram’s security features and high level of encryption will almost certainly make detecting and monitoring extremist activity more difficult. This will almost certainly complicate the prosecution of extremists, as private chats are stored only on the device itself and can easily be erased by either the sender or the recipient. Prosecutors will likely find it difficult to gather evidence of crimes, leaving more extremists free to continue spreading their ideology and potentially causing harm. As a result, extremists will likely feel emboldened, believing they can act with greater impunity, leading to an expansion of their activities.

Date: October 5, 2021

Location: Washington DC, USA

Parties involved: former Facebook product manager Frances Haugen; Facebook; US Congress

The event: On Tuesday, Haugen testified before the US Congress against Mark Zuckerberg’s social media platforms. Her testimony included accusations that Facebook and Instagram prioritize financial gain over the safety of their users.[6] Haugen also stated that these networking sites’ engagement algorithms favor the spread of misinformation, hate speech, and violent content.[7] She stated that the company was partially responsible for the January 6 Capitol riots and was failing to prevent certain nations from spreading disinformation and spying on other countries via Facebook.[8] During the hearing, the former Facebook employee proposed recommendations to make the platform a safer space for its users, such as raising the minimum age for social media use to 17.[9] Additionally, she suggested amending Section 230 of the Communications Decency Act, which currently exempts social media platforms from liability for content created and shared by their users.[10]

The implications:

  • Haugen’s accusations have likely caused a large share of Facebook users to lose trust in the platform. While it is very likely that the majority of them will continue to use Facebook’s services, some may become more critical of the information they encounter on it. Accordingly, any extremists still using Facebook to recruit, communicate, and spread their ideology are likely to adopt new techniques, such as more subtle messaging that attracts users’ attention without arousing suspicion. As this content will be more difficult for both users and moderators to detect, users are likely to be exposed to misinformation and extremist rhetoric if detection and reporting mechanisms remain insufficient.

  • Growing concern about the issues raised in the hearing is likely to result in greater government attempts to regulate social media platforms. Some measures suggested by Haugen, such as holding these platforms accountable for the content to which their users are exposed, are likely to be adopted, as they would likely improve networking sites’ security. For instance, raising the minimum age for social network use would likely leave minors less exposed to extremist content and therefore better protected against online radicalization, to which they are increasingly vulnerable.[11] However, modifications to these laws would likely generate clashes between advocates of increased online safety and those who defend freedom of expression and more lenient control measures. This would very likely trigger heated online discussions as well as protests across different countries, which extremist individuals and organizations could exploit, potentially resulting in violent confrontations between members of the two camps.

  • Facebook will likely strengthen its online security and misinformation detection mechanisms to respond to the growing public disapproval generated by the whistleblowing. It is also likely that the company will adopt some of Haugen’s recommendations in order to improve its performance. This would very likely improve public perception of the company and increase the security of its users. However, it is unlikely that Facebook will make structural changes to its engagement algorithm, as rewarding engaging content generates large economic benefits for the company. As a result, extremists who remain on Facebook will continue to exploit an algorithm that favors the dissemination of violence, hate, and misinformation online.

Specialty reports are designed to inform clients of existing and emerging threats worldwide. To defeat terrorists and individuals intent on doing harm, it is critical to understand and investigate them. We collect and analyze intelligence on terrorists and extremists, their organizations, their tactics, and their attacks to develop solutions to detect, deter, and defeat any act of terrorism or violence against our clients. We also conduct investigations to identify persons of interest, assess the likelihood of a threat, and determine how to stop it. To find out more about our products and services, visit us at counterterrorismgroup.com.

The Counterterrorism Group (CTG)

[1] “Facebook” by Stock Catalog, licensed under Creative Commons

[2] Facebook explains error that caused global outage, The Guardian, October 2021, https://www.theguardian.com/technology/2021/oct/05/what-caused-facebook-whatsapp-instagram-outage

[3] What IT can learn from the Facebook outage, TechGenix, October 2021, https://techgenix.com/it-can-learn-from-the-facebook-outage/

[4] Telegram says it added 70m new users during Facebook outage, The Guardian, October 2021, https://www.theguardian.com/media/2021/oct/06/telegram-says-added-70m-new-users-during-facebook-outage

[5] Ibid.

[6] Here are 4 key points from the Facebook whistleblower's testimony on Capitol Hill, NPR, October 2021, https://www.npr.org/2021/10/05/1043377310/facebook-whistleblower-frances-haugen-congress

[7] Ibid.

[8] Key takeaways from Facebook’s whistle-blower hearing, The New York Times, October 2021, https://www.nytimes.com/2021/10/05/technology/what-happened-at-facebook-whistleblower-hearing.html

[9] Why whistleblower Frances Haugen is Facebook's worst nightmare, CNN, October 2021, https://edition.cnn.com/2021/10/06/tech/facebook-frances-haugen-testimony/index.html

[10] Whistleblower to Senate: Don't trust Facebook, Politico, October 2021, https://www.politico.com/news/2021/10/05/facebook-whistleblower-testifies-congress-515083

[11] Teen terrorism inspired by social media is on the rise. Here’s what we need to do, NBC News THINK, March 2021, https://www.nbcnews.com/think/opinion/teen-terrorism-inspired-social-media-rise-here-s-what-we-ncna1261307

