Are Telegram and Signal Havens for Right-Wing Extremists?

The best model for tackling violent right-wing groups on heavily encrypted apps is the fight against the Islamic State.

Supporters of U.S. President Donald Trump protest inside the U.S. Capitol in Washington on Jan. 6. ROBERTO SCHMIDT/AFP via Getty Images

Since the violent storming of the U.S. Capitol, the subsequent bans of former U.S. President Donald Trump from Facebook and Twitter, the removal of Parler from Amazon's servers, and the de-platforming of incendiary right-wing content, the messaging services Telegram and Signal have seen a deluge of new users. In January alone, Telegram reported 90 million new accounts. Its founder, Pavel Durov, described this as "the largest digital migration in human history." Signal reportedly doubled its user base to 40 million people and became the most downloaded app in 70 countries. The two services rely on encryption to protect the privacy of user communication, which has made them popular with protesters seeking to conceal their identities from repressive governments in places like Belarus, Hong Kong, and Iran. But the same encryption technology has also made them a favored communication tool for criminals and terrorist groups, including al Qaeda and the Islamic State.

The surge in new users was not entirely due to the major platforms' crackdown on right-wing groups. WhatsApp's new privacy policy, which users incorrectly concluded would allow the Facebook subsidiary to share their data even more widely, was at least as relevant in shifting users to Telegram and Signal. Regardless of the reasons, however, both apps are swelling in popularity, which raises troubling questions about whether hate groups, extremist organizations, and other nefarious actors no longer welcome on mainstream platforms will exploit them even further. Although denying extremist groups a public social media presence would be a win, their migration to encrypted apps raises concerns that secret communications could facilitate further violence.

To see why many are alarmed, we don’t have to look far back. In 2015, Telegram saw growing use by the Islamic State. Previously, the group had relied on Twitter and Facebook to recruit new members, coordinate activities, and promote its ideology. After the two public platforms finally banned Islamic State content and began taking it down, the group turned to Telegram and similar services to facilitate communications, including recruitment and planning terror attacks.

Telegram's coordination with law enforcement in response to its use by the Islamic State offers a blueprint for countering the platform's extreme right-wing groups, including insurrectionists. Even though Telegram lacks the ability to monitor encrypted private channels, it eventually removed nearly all Islamic State accounts. This step drastically curtailed the Islamic State's ability to use Telegram's encrypted messaging. Along the way, the company, which long rejected cooperating with law enforcement, developed a collaborative relationship with Europol, the European Union's transnational investigative force. Investigators at the agency, formally known as the European Union Agency for Law Enforcement Cooperation, encouraged Telegram to remove Islamic State content without resorting to the heavy-handed legal measures to which the company is so allergic.

Telegram offers users two levels of encrypted communication. Its services, which include groups, channels, and messages, can be either public (searchable by username) or private (by invitation only). Although Telegram itself can theoretically view public services, secret chats are encrypted end to end. This means that no one—not even Telegram executives—can view these private messages. To make communication even more secure, Telegram offers users the option to set their messages to self-destruct after they are read so any record of the communication disappears into the ether. While Telegram’s terms of service ban the promotion of violence, they are only enforceable in connection with publicly viewable channels.

Telegram's unregulated nature sparked fears when Islamic State members migrated to the platform en masse after they were banned from Twitter in 2015. Observers of online Islamic State activity reported that the group openly used Telegram's public services to spread propaganda, recruit new members, and deliver instructions for carrying out attacks on specific targets. Islamic State members also communicated with one another in secret, encrypted chats. One of the perpetrators of the 2015 Paris terror attacks had downloaded Telegram on his phone on the day of the rampage and may have used the service to coordinate the operation. After the attack, the Islamic State used Telegram to claim responsibility. Similarly, following the 2016 truck attack on a Berlin Christmas market, the Islamic State not only used Telegram to claim credit but also to release a video in which the perpetrator vowed to kill Westerners. Likewise, the Islamic State gunman who carried out the 2017 attack on the Reina nightclub in Istanbul received instructions from a Syrian member of the group via Telegram.

In 2016, Durov said he was horrified by terrorist activity on his platform and that he was trying to prevent it, but he reminded authorities that the platform’s technology made it impossible for him to reveal the encryption key for any private chats even if he wanted to. He also pointed out that if he banned the Islamic State from Telegram, they could start their own messaging app “within a month or so.” Durov ominously added, “This is the world of technology, and it’s impossible to stop them at this point.”

Governments around the world pressured Telegram to block Islamic State content and cooperate with investigations into terror suspects using the platform. In Washington, members of the U.S. House Foreign Affairs Committee declared that “no private company should allow its services to be used to promote terrorism and plan out attacks that spill innocent blood.” French authorities obtained a court order to try to force Telegram to provide information about terrorist activity on the platform. After an Islamic State attack in St. Petersburg in 2017, the Russian government threatened to ban Telegram.

Durov continued to reject any sharing of confidential user data. His reluctance to cooperate with law enforcement, even in investigations of terrorism, may stem from his experience with authorities in Russia. In 2006, Durov founded VKontakte, Russia's version of Facebook, but he left the company and Russia after refusing to hand over user data to the Kremlin or block the accounts of opposition activists. Durov wrote that "Telegram has never yielded to pressure from officials who wanted us to perform political censorship," and the company remains wary of law enforcement. This posed a quandary for those seeking to counter the Islamic State's reach through Telegram: How could they convince the company, at the very least, to remove public content from the terrorist group when Telegram was fundamentally opposed to government interference and censorship?

The solution came from an unlikely source: Europol’s Internet Referral Unit. Following the 2015 Paris attacks, Europol began to flag extremist Islamic State content to Telegram when it was posted on the company’s public channels. Europol did not try to subpoena user data or demand the removal of content. It simply brought the Islamic State’s promotion of violence on the platform to the company’s attention.

According to Europol officials, this opened a channel of communication between law enforcement and company executives, which helped build trust. Telegram, perhaps realizing it had a reputational issue and risked more drastic steps by governments, started to take responsibility for moderating terrorist content on its platform. Key to this relationship was the fact that Europol did not have any legal mandate over Telegram. The Internet Referral Unit’s sole purpose was to flag terrorist content and work with social media companies to raise awareness about various terrorist groups’ online activities. Europol left the final decision about whether to remove content to Telegram itself.

The relationship between Europol and Telegram started to pay dividends. Telegram began maintaining an "ISIS Watch" channel that posted daily updates about terrorist content banned from the platform. In December 2016, Telegram announced it was routinely removing 70 Islamic State channels per month. Finally, in late 2019, Telegram and Europol conducted a joint operation that deleted 43,000 terrorist-related user accounts, effectively removing the Islamic State from the platform. This campaign was uniquely effective. Telegram had previously only deleted channels and groups, which Islamic State members could easily recreate. This time, the platform went further by deleting the accounts of users within those channels and quickly blocking any attempts by the group's supporters to create new accounts.

The purge of the Islamic State from Telegram crippled the terrorist organization's ability to spread propaganda and secretly communicate within the platform's encrypted functions. Although Telegram could never access the private communications of Islamic State members, it could identify violent extremist users by their presence in channels that posted content violating its terms of service, which in turn provided an avenue to purge those users from the platform. In this way, Telegram proved it could deftly balance its commitment to privacy with its rejection of violent extremist content.

The right-wing insurrection on Jan. 6 that left five people dead at the U.S. Capitol has raised similar concerns in the United States that violent extremist groups will exploit Telegram to secretly coordinate attacks. Since the attack, Telegram has already shut down what it says are "dozens" of channels calling for violence against the U.S. government or advocating for ethnic cleansing, neo-Nazism, or guerrilla warfare.

Parallels between the Islamic State and violent U.S. extremist groups are not perfect, of course. Domestic insurrectionist militias have not killed thousands of civilians nor are they universally considered terrorist organizations in the same manner as the Islamic State. Telegram is probably more hesitant to remove content from accounts spreading “stop the steal” slogans or calling for the overthrow of the U.S. government, not least because these ideas are tacitly supported by a major U.S. political party. Durov would certainly face accusations of censorship from right-wing activists if he followed the example of Twitter or Facebook with their blanket removal of such channels.

However, the handling of the Islamic State still provides valuable lessons. First, it demonstrates that Telegram, if it so desires, can monitor and remove violent content without compromising its encrypted messaging functions. When a user publicly advocates violence or joins a channel promoting violence, Telegram can remove that user's account, stripping them of platform access and the ability to communicate via private chats. But the case also shows the limitations of patrolling Telegram: Islamic State members who were careful never to publicly post about violence or join terrorist-linked channels, and who communicated only via private chats, may still be using the platform undetected today. The future of right-wing extremist groups on Telegram likely depends on whether they publicly advocate violence or restrict their plotting to private communications. Of course, extremists who stick to private communications will find it harder to get attention and recruit new members, naturally reducing their reach.

The case of the Islamic State also reveals how law enforcement agencies can form an effective partnership with Telegram in spite of the company's distrust of governments and expansive view of free expression. Europol's collaborative approach allowed it to gain Telegram's trust and foster cooperation. Ideally, a U.S. equivalent of Europol's Internet Referral Unit could develop a working relationship with Telegram to help limit the reach of U.S. extremist groups. This could serve as a model of cooperation not just with Telegram but with Signal and other encrypted apps to stem violent content and domestic threats.

Law enforcement agencies already have a successful track record working with tech companies to blunt the reach of the Islamic State. There’s no reason why they can’t similarly work together against the Proud Boys, Oath Keepers, and other extremist groups today. Past experience should also calm fears that after their removal from Twitter and Facebook, extremists will now be even more effective at plotting schemes out of view. The case of the Islamic State suggests instead that collaborative partnerships between law enforcement and Telegram—as well as other encrypted apps like Signal—can be effective in driving violent extremists from the internet and quashing their public calls for violence.

Steven Feldstein is a senior fellow in the Carnegie Endowment for International Peace’s program on democracy, conflict, and governance.

Sarah Gordon is a research assistant in the Carnegie Endowment for International Peace’s program on democracy, conflict, and governance.
