

Self-Isolation Might Stop Coronavirus, but It Will Speed the Spread of Extremism

Millions of people stuck at home will turn to social media, where disinformation is rife. Radical Islamists and far-right groups are exploiting widespread confusion and fear to spread hate.

A member of the Iraqi forces walks past a mural bearing the flag of the Islamic State in Albu Sayf, Iraq, on March 1, 2017. Ahmad al-Rubaye/AFP/Getty Images
EDITOR’S NOTE: We’re making some of our coronavirus pandemic coverage free for nonsubscribers. You can read those articles here. You can also listen to our weekly coronavirus podcast, Don’t Touch Your Face, and subscribe to our newsletters here.

The COVID-19 crisis has ravaged many countries across the globe—and it has also presented an opportunity for extremist groups across the ideological spectrum to spread hate. As is often the case in times of uncertainty, extremists and terrorists have jumped at the chance to exploit confusion and fear, reach new audiences, and serve their own interests.

This is worrying for several reasons. In 2014, when the academic community was studying the effects of Islamic State propaganda on people’s willingness to travel abroad and join the conflict in Iraq and Syria, it was clear that recruiters’ appeal lay in a target audience’s need to understand its place in the world. As more information became available on those who joined terrorist organizations, or even those who committed terrorist attacks in their own countries, a common theme was the need to belong to an “insider” community and to commit violence against, or destroy the way of life of, those in the “outsider” community.

Those who hold such views are seen as intolerant by the majority, but they are often able to appeal to a minority because they are charismatic or because, in times of uncertainty, their outlandish claims seem to make sense. So a 21-year-old British woman may become so enticed by the YouTube sermons of Anwar al-Awlaki that she is galvanized to stab a member of Parliament because he voted for the Iraq War. The push factors driving the decision—loneliness and boredom—are then coupled with a pull factor: a charismatic recruiter who explains her duty to her Muslim brothers and sisters and just so happens to do so in English.

With the spread of COVID-19, people are being instructed to stay at home, and rightly so. Unfortunately, this risks increasing the consumption of fake news, conspiracy theories, and extremist material online, as people try to make sense of the crisis surrounding them. While governments have made major efforts to provide accurate information about COVID-19 online, there are two areas where social media companies will have to remain vigilant: the rise of conspiracy theories and the role this plays in calls to increase targeted violence against at-risk communities. The first often leads to the second.


The Henry Jackson Society, where I work, collaborates with social media companies to identify extremist videos and advise on the removal of online content. While the system has improved greatly since 2015—with the use of automated algorithms to recognize terrorist symbols, music, and content, meaning that 98 percent of this material is removed before a human can see it—companies still struggle with the so-called gray-area material where extremism thrives.

This is because those who upload this content know exactly how to avoid violating the terms and conditions that would get their material removed. For example, rather than denying the Holocaust ever happened—which would result in a video being removed from YouTube—a speaker might fudge the numbers of how many people were killed, making it seem less of an atrocity and part of a wider conspiracy theory manufactured by the Jewish people. (And they will often use a dog whistle such as “Rothschild” or “skypes” to refer to Jewish people so a computer can’t pick it up.)

In the midst of the COVID-19 outbreak, a scan of YouTube reveals a spike in doomsday, crusader, and jihadist videos featuring “Mahdi,” a messianic deliverer anticipated to appear before the Day of Judgment and rid the world of evil. For Islamist extremists, this rhetoric is coupled with long lectures featuring speakers discussing how COVID-19 is a punishment from Allah. The Islamic State even urged its followers not to “enter the land of the epidemic.”


Some activists on the Islamist side claim that the coronavirus is an American and Jewish plot to reduce the world population, or that Jews are more dangerous than the coronavirus, AIDS, and cholera combined. In addition to stirring up hate, religious videos often feature dangerous health misinformation; posts uploaded to social media in Iran this month show people licking holy shrines in defiance of COVID-19, ostensibly to “protect future pilgrims” to the shrines from contracting the virus.

Health disinformation is an important gray area not currently covered by social media policy. As a result, users can claim that COVID-19 is the result of homosexuality, or blame other ethnic and religious groups to fit the outbreak into broader existing conspiracy theories. Looking straight at the camera during a live Facebook session, the New York-based political activist Bahgat Saber calmly called on Egyptians who have contracted the coronavirus to exact revenge by intentionally infecting state officials and government employees.


A recent FBI memo describes how far-right extremists are urging the spread of coronavirus to Jews. Messages disseminated among the groups encourage their members to use “spray bottles” filled with infectious body fluids in order to attack police and to travel to any place Jews might congregate, including “markets, political offices, businesses, and places of worship.”

In the United Kingdom, a poster shared by the British National Socialist Movement on Facebook argues that if you get COVID-19, you should visit your local mosque or synagogue and spend time in diverse neighborhoods where “increased exposure to diversity is clinically proven to provide short-term and long-term benefits to immune system function.”

On Telegram, the alt-right personality Milo Yiannopoulos asked his followers in a poll which was the “biggest hoax of our lifetime: Acid Rain, Climate Change, Satanic Ritual Abuse, Coronavirus.” Far-right groups have used the opportunity to peddle conspiracy theories and incite racial abuse against Asian diaspora communities. Research by the Anti-Defamation League, for example, found online cartoons depicting an Asian “Winnie the Flu” and violent imagery against Asians on the extremist-friendly platforms Telegram, 4chan, and Gab. Such disinformation campaigns are aided by state actors such as China, which seeks to deflect blame for its government’s actions, and Russia, which is looking to deepen divisions, sow distrust, and exacerbate crisis situations.

This harmful and hateful rhetoric against certain ethnic groups, religions, and communities is coupled with terrorist organizations issuing new orders to attack countries while they are preoccupied with COVID-19. For example, in the editorial of the Islamic State’s weekly newspaper, Al-Naba, circulated last week, the group described the pandemic as a divinely ordained “painful torment” for “crusader” armies and security forces, which are overstretched by their efforts to deal with the pandemic. Given that the group is restricting its deployment of jihadists abroad, it instructed supporters to take advantage of the epidemic to free prisoners from the “prisons of polytheists and the camps of humiliation.”

It is clear that as more people self-isolate—as they should—they will stay indoors and seek answers to their questions on social media networks. It is therefore imperative that social media companies and governments act to shut down disinformation from across the ideological spectrum. Where content cannot be removed, it should be monitored. This will help ensure that such material does not snowball into offline violence, as terrorist propaganda has in the past.

While much has been done to educate people about washing their hands, more must be done to counter extremists seeking to spread their hateful agendas. One way to address this quickly would be the creation of a new misinformation flag on YouTube and Facebook, which would allow users to pinpoint content that is factually incorrect or harmful. Moreover, lessons learned from using counternarratives to combat terrorism—where videos on the reality of Islamic State brutality ran alongside Islamic State propaganda—can also be applied to the extremist content that is spiking as a result of COVID-19.

A deliberate, coordinated, and proactive response from social media companies is the only way to ensure that those self-isolating at home are not infected with disinformation.

Nikita Malik is director of the Centre on Radicalisation and Terrorism at the Henry Jackson Society in London.
