The First Step

Ten years ago this week the United States launched the invasion and occupation of Iraq. Ten years later we can and should say that the United States lost that war.

Recognizing that the United States lost (and why) is a bit like an alcoholic admitting that he has a problem — it’s the first step on the road to recovery and preventing such grievous errors from being repeated in the future.

Defining a wartime loss can be a tricky thing. Clearly, the United States does not look defeated in the way that, say, Japan looked at the end of World War II. But if one uses the Bush administration’s criteria for what the war in Iraq was supposed to accomplish, it’s a loss all the same.

The initial rationales used by President Bush to justify military action in Iraq revolved around two ideas: one, that Iraq had stockpiles of weapons of mass destruction and the inclination to build more; and two, that Iraq had direct links to terrorist organizations that could use these weapons to attack the United States. As Bush said at the time, "We are now acting because the risks of inaction would be far greater. In one year or five years, the power of Iraq to inflict harm on all free nations would be multiplied many times over." Removing Saddam Hussein — one of the objectives that the United States did accomplish in Iraq — was viewed as a key means for lessening these alleged threats.

But we now know that Iraq did not have weapons of mass destruction, or even an active program to build them; it was not attempting to rebuild its nuclear program; it did not have links to al Qaeda; it was not working with international terrorist groups to attack the United States. Iraq was a neutered, battered, and marginalized nation, hemmed in by international sanctions and the enforcement of no-fly zones in the north and south of the country. Saddam was a threat to no one but his own people.

Acolytes of the Bush administration (and they are a small group) could argue that the United States was not fully aware of Iraq’s vulnerable state, its lack of WMD, or the hollowness of its alleged links to terrorism. The problem, of course, is that one doesn’t get a mulligan when it comes to invading and occupying a foreign country and squandering thousands of American lives (and some 100,000 Iraqi lives), as well as trillions of dollars. Even if the Bush administration was misled by CIA intelligence or had the best of intentions in Iraq, that’s an explanation (and an unbelievably charitable one) — not an excuse.

After all, containment of Iraq was working — and was significantly strengthened in the run-up to the war. In May 2002, the U.N. Security Council unanimously approved a program of "smart sanctions," overcoming long-standing Russian opposition and renewing international support for keeping Saddam in his box. And a few months later, the Security Council again unanimously approved a resolution demanding Iraq allow U.N. inspectors back into the country — which it did in January 2003. The initial IAEA and U.N. monitoring teams reported that they could find no evidence of WMD in Iraq, but that more inspections were needed. Perversely, this was viewed in Washington as evidence of continued Iraqi obfuscation and even more reason to invade the country. Even if one believes that the sanctions regime was wavering, the steps taken by the United Nations were compelling evidence that containment of Iraq could be maintained — and that the allegedly urgent need for invasion and occupation was thoroughly overstated.

Of course, in the years after the invasion — and after no WMD or links to terrorism were found — the rationale for sustaining the U.S. presence in Iraq shifted: from removing Saddam as a threat to birthing liberty between the Tigris and the Euphrates. While putting Iraq on the road to stability and even democracy might be a laudable goal, it was certainly never the war’s primary objective.

Regardless, can anyone seriously argue that building a democracy in Iraq was worth the price tag? Whatever your priorities, the opportunity costs alone are stratospherically large. The $3 trillion spent by the United States in both direct and indirect costs could have balanced the budget (several times over), fixed the nation’s crumbling infrastructure, ensured that every person with HIV had access to life-saving medications, or ended world hunger.

In the end, a war that was predicated on dubious suppositions, diverted trillions in resources, cost thousands of lives, and did not further U.S. security interests really only has one descriptor: a loss.

So why does this matter? Does the United States really need to engage in self-flagellation over the war in Iraq? Actually, yes it does.

To be sure, accepting defeat in war is a rare thing. As Paul Fritz, a political science professor at Hofstra University, said to me, "No one likes to label themselves a loser and most of the time will find ways to rationalize, deflect, or otherwise not accept the fact and consequences of a loss, implicitly or explicitly." History is replete with such examples. Perhaps the most vivid is the stabbed-in-the-back myth that followed Germany’s loss in WWI, but even after the more total defeat of WWII, Germany initially saw itself as a victim and only years later accepted culpability for its actions during the war. In the case of Japan, penance was imposed by the United States, and even to this day there has been little of the national soul-searching in which Germany eventually engaged.

In the United States, after Vietnam, while there was perhaps not an explicit recognition of defeat, there was an understanding that steps must be taken to avoid such a calamitous and ill-advised conflict in the future. That belief became the impetus for the so-called Weinberger Doctrine, which established clear criteria for the use of force: a vital national interest must be at stake; wars should be fought wholeheartedly; political and military objectives must be clear; those objectives and the size of the force engaged must be constantly reassessed and adjusted; the support of Congress and the American people is paramount; and, finally, force must be the last resort. Colin Powell later expanded on these tests with the inclusion of overwhelming military force and full consideration of the consequences of military action.

Across three administrations — Reagan, Bush 41, and Clinton — civilian leaders largely abided by the spirit if not always the letter of the doctrine. While the definition of vital national interest was occasionally stretched very far, by and large military engagement was de-emphasized, the political impact of extended wars was given greater consideration, and the wars that were fought were either limited in scope or, as in the first Gulf War, conducted on a grand scale. Allied support and a strong military force were also considered essential.

In the years since the Iraq War wound down, there has been little of the same reflection. Instead, military and, in particular, civilian leaders took away from Iraq not the strategic mistakes and faulty assumptions that underpinned the decision to go to war, but the tactical advances that allegedly salvaged the conflict in 2007. This is, for lack of a better term, "surge triumphalism." It reflects the abiding belief among military leaders that the U.S. Army learned crucial lessons in waging counterinsurgency — and is why they pushed for more troops in 2009 to wage the war in Afghanistan.

However, a proper appreciation of the lessons of Iraq — not just by the military but also by civilian leaders — should have led to different outcomes in Afghanistan. What is most startling about Afghanistan is the extent to which U.S. policymakers — even from a new administration and a different party — made so many of the same mistakes as their predecessors had made in 2003. No clear political objectives were established; options other than U.S. boots on the ground, including containment, were rejected; the process of political development and governance was deeply misunderstood; the potential threats to U.S. national security from inaction were dramatically overstated; and there was once again a complete lack of appreciation for the limits of American diplomatic and military power. Hubris and strategic incoherence drove the United States to fight an ill-advised war in Iraq; they led to similar misjudgments six and a half years later in the decision to surge in Afghanistan.

During the Afghan surge debate in 2009, former British Army officer Rory Stewart recounted his experience meeting with U.S. officials about the decision to send more troops to Afghanistan: "‘It’s like they’re coming in and saying to you, I’m going to drive my car off a cliff. Should I or should I not wear a seatbelt?’ And you say, ‘I don’t think you should drive your car off the cliff.’ And they say, ‘No, no, that bit’s already been decided — the question is whether to wear a seatbelt.’" The problem in both Iraq and Afghanistan was that the decision to use force was made before the United States had even figured out what it wanted or could accomplish — and that useful criteria for thinking about the use of American military power were ignored.

Such mistakes must not be made again. But with policymakers on both sides of the political aisle rattling sabers at Iran, they very well could be.

"Don’t fight stupid wars" is perhaps the most enduring lesson of the past 10 years in American foreign policy — and it’s pretty good advice for the future. But refusing to fight dumb wars begins with understanding how the United States found itself fighting two of them back to back. It’s not enough to simply recognize that Iraq was a strategic error; we must understand the many reasons why. And that begins with recognizing that America lost the war in Iraq before a single shot was ever fired.