Another day, another dead terrorist leader. This time, the dead terrorist guy was Taliban chief Mullah Akhtar Mohammad Mansour, killed by a U.S. drone strike in Pakistan, and everyone is cheering, because everyone knows that each dead terrorist leader is, well, another dead terrorist leader.
Mullah Mansour’s death is “an important milestone,” President Barack Obama declared on Monday morning. But it’s a milestone on the road to nowhere.
After all, we’ve passed this milestone several dozen times before. Let’s briefly recap:
- In April, U.S. and Kurdish forces killed Salman Abu Shabib al-Jebouri, a “senior” Islamic State leader in Iraq. In March, U.S. forces killed “top Islamic State commander” Abd al-Rahman Mustafa al-Qaduli in Syria. The United States also killed Islamic State “deputy” Haji Imam and “senior operative” Abu Omar al-Shishani in Syria that same month. In February, U.S. strikes killed Noureddine Chouchane, another “senior” Islamic State operative in Libya.
- In 2015, the United States managed to kill “senior al-Shabab leader” Abdirahman Sandhere in Somalia; militant chief Mokhtar Belmokhtar in Libya; militant commander Khawray Mehsud in Pakistan; Islamic State “No. 2” Fadhil Ahmad al-Hayali in Iraq; al Qaeda leader Nasir al-Wuhayshi in Yemen; and Ustad Ahmad Farooq, the deputy emir of al Qaeda in the Indian subcontinent.
- In 2014, U.S. strikes killed terrorist leader Ahmed Abdi Godane in Somalia, followed by al-Shabab intelligence and security chief Tahlil Abdishakur, as well as Nusra Front leader Abu Yousef al-Turki, who was killed by U.S. strikes in Syria, and Omar Farooq, a senior al Qaeda coordinator. The United States also successfully targeted Mufti Sofian and Abu Bakar, both senior commanders in the Afghan Taliban.
- In 2013, the United States killed Said al-Shehri, the second in command of al Qaeda in the Arabian Peninsula (AQAP). The Taliban’s original leader, Mullah Mohammed Omar, also bit the dust in 2013, though he reportedly died of a non-drone-related illness, and his death apparently made so little difference to the Taliban that no one even noticed he was dead until 2015.
- In 2012, U.S. strikes killed another “al Qaeda No. 2,” Abu Yahya al-Libi, in Pakistan; senior al Qaeda leader Sakhr al-Taifi in Afghanistan; AQAP leader Fahd Mohammed Ahmed al-Quso in Yemen; Taliban ally Badruddin Haqqani in Pakistan; and Jemaah Islamiyah leader Zulkifli bin Hir in the Philippines.
- In 2011, U.S. drones killed Fazul Abdullah Mohammed in Somalia; the American cleric and al Qaeda propagandist Anwar al-Awlaki in Yemen; al Qaeda’s chief of operations in Pakistan, Abu Hafs al-Shahri; and Atiyah Abd al-Rahman, reportedly al Qaeda’s second in command after Osama bin Laden’s death; along with Ilyas Kashmiri, said to be a senior Pakistani al Qaeda operative. And, of course, we killed Osama bin Laden himself, al Qaeda’s original No. 1.
I left out a few dozen other senior terrorist leaders, most of whom were also killed by U.S. strikes, but you get the idea.
The “important milestones” come and go; we keep on killing bad guys, and the bad guys just keep on keeping on. In the three years since the death of Taliban leader Mullah Omar, the group appears to have gotten stronger, not weaker: Afghanistan experts say the Taliban now control more territory in the country than at any time since before the 2001 U.S. invasion. The Islamic State, the target of intense U.S. attacks since 2014, has lost some of the territory it once held in Syria and Iraq, but the group still managed to pull off mass-casualty attacks in Belgium, Turkey, Libya, Tunisia, Yemen, Kuwait, Pakistan, and Afghanistan — not to mention ongoing brutalities in Syria and Iraq. Al-Shabab and Nigeria’s Boko Haram continue to leave a trail of bloodshed across Africa, and even al Qaeda, which has surely had more “senior leaders” killed than any other terrorist group, continues to stage a zombie-like comeback around the globe, with recent mass attacks in Yemen, Ivory Coast, and Burkina Faso.
So what’s the point of all these killings?
There’s some decent academic theory behind so-called “decapitation strikes”: The idea is that killing senior bad guys throws their organizations into disarray, rendering them less effective, while simultaneously convincing junior bad guys and potential recruits that the risk of death just isn’t worth it.
Intuitively, it’s an appealing idea; it just doesn’t seem to be working out as planned. Maybe that’s because today’s terrorist groups — unlike the older, pre-9/11 organizations upon which the evidence for decapitation strikes was mostly based — are often highly decentralized to begin with, making leadership continuity less important. Maybe it’s because mass-casualty attacks can be carried out cheaply and easily even by rank amateurs these days, given the ready availability of weapons and internet bomb-making instructions. Maybe it’s because people who glorify martyrdom can’t be deterred by high casualty rates.
Either way, it’s difficult to share Obama’s conviction that this time, with the death of Taliban leader Mullah Mansour, we’ve finally turned a corner, showing both the Taliban and the Pakistani government — which decried the strike against Mansour as a violation of its sovereignty — that Washington means business. U.S. counterterrorism strikes may be emotionally satisfying for White House and Defense Department officials, but they come with a cost: Their questionable legal status troubles even key U.S. allies, while the death and destruction they cause can breed resentment in affected communities, potentially boosting rather than undermining terrorist recruitment efforts. This is particularly true when U.S. strikes kill civilians, as some inevitably do.
Here in the United States, domestic efforts to combat terrorism rest on equally shaky foundations. In their recent book Chasing Ghosts: The Policing of Terrorism, John Mueller and Mark Stewart — an American political scientist and an Australian risk assessment expert, respectively — persuasively argue that most counterterrorism programs undertaken by the FBI, the Department of Homeland Security, and local law enforcement are both expensive and pointless. The United States, they note, has created at least 263 new counterterrorism organizations since the 9/11 attacks — a rather high number, given that those same organizations have so far apprehended fewer than 100 people for allegedly planning attacks within the United States. (And many of these terrorist suspects seem to have posed little serious threat: With a handful of exceptions, most were amateurs with grandiose plans far exceeding their competence levels.)
Still, we continue to spend about $115 billion per year on domestic counterterrorism, though an objective assessment of many signature government programs — from TSA security screenings at airports to NSA surveillance and Homeland Security “fusion centers” — suggests, as Mueller and Stewart put it, that many such programs are “all cost and [little or] no benefit.” Compared with typical U.S. government spending on workplace safety, ordinary crime, health care, and automotive safety, they note, we’d need to be saving some 7,000 to 8,000 American lives each year to make this level of counterterrorism spending worth it. Meanwhile, “diverting even a few billion dollars” from the counterterrorism budget each year and putting that money toward, say, “smoke alarms, tornado safety [or] greater car safety” would save far more lives than current U.S. counterterrorism programs.
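Mueller and Stewart’s break-even figure is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, using the $115 billion annual budget cited above and assuming a “value of a statistical life” of roughly $15 million — my illustrative figure, in the range U.S. regulatory agencies use for cost-benefit analysis, not necessarily the exact input Mueller and Stewart employ:

```python
# Back-of-the-envelope check of the break-even claim: how many lives
# per year would counterterrorism spending need to save to justify
# its cost at a standard regulatory valuation of a life?
budget = 115e9          # annual domestic counterterrorism spending, $ (figure cited above)
value_per_life = 15e6   # assumed "value of a statistical life," $ (illustrative)

break_even_lives = budget / value_per_life
print(round(break_even_lives))  # → 7667, within the 7,000-8,000 range cited
```

Against the fewer than 100 Americans actually killed by terrorism in a typical year, the spending falls short of that break-even point by roughly two orders of magnitude.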
Here’s the biggest irony: The horrific 9/11 attacks notwithstanding, Americans still face remarkably little risk of dying in a terrorist attack. Worldwide, most terrorist violence occurs in regions riven by armed conflicts. For Americans, terrorism has never been more than a minuscule risk: In a typical year, fewer than 100 Americans are killed in terrorist attacks, at home or abroad. (On average, more Americans are killed each year in lightning strikes, not to mention “ordinary” gun violence.) Yet despite the minimal risk to American lives, the United States spends billions of dollars each year killing terrorist leaders abroad and searching out the exceedingly rare terrorist plot at home.
It reminds me of an old joke:
There’s a guy walking through the streets of New York City, rhythmically banging a stick against every store window he passes.
Finally, a passerby, who’s been watching him for several minutes, stops him and asks, “Why are you doing that?”
“To keep away the tigers,” the man with the stick explains.
“But,” the passerby splutters, “there are no tigers in New York!”
“See?” the man proudly exclaims. “It works!”
Maybe there’s a lesson here. Maybe we shouldn’t bother analyzing U.S. counterterrorism programs through the lens of rationality. Maybe, rather than viewing those counterterrorism efforts as policy and budget choices to which we can reasonably apply economic cost-benefit analysis, we should view them instead through the lens of anthropology.
After all, human societies throughout history have developed magical rituals designed to ward off real or imagined evil. Anthropologists call these apotropaic rituals. From ancient Greece to early Britain, numerous cultures sacrificed animals — and sometimes humans — to propitiate the gods and prevent misfortune. In medieval Europe, ancient China, and pre-European Native American settlements, groups developed elaborate dances and other rituals to prevent drought and dangerous storms. In Europe, medieval pilgrims displayed badges with bawdy images to ward off the plague; in colonial New England, women placed coins once held by corpses under their pillows to prevent male demons from impregnating them while they slept.
We modern Americans don’t believe in demons, rain dances, or the efficacy of sacrificing children or goats. We’ve developed our very own 21st-century magic rituals — and we call them “counterterrorism programs.”