By Rosa Brooks

Rosa Brooks is a law professor at Georgetown University and a Schwartz senior fellow at the New America Foundation. She served as a counselor to the U.S. defense undersecretary for policy from 2009 to 2011 and previously served as a senior advisor at the U.S. State Department.
Do our best instincts cause our worst failures?
Consider Iraq. Granted, some not-so-great instincts brought us there in 2003 — but it was our best instincts that kept us there. Remember "you break it, you own it"? We took the Pottery Barn Rule seriously enough to stick around for the next eight years, trying earnestly to glue the shattered pieces back together.
The glue never stuck. We couldn’t bring ourselves to believe gloomy predictions that looting might take place, and we convinced ourselves the Iraqis would be better off if we could just get the Baathists out of government and disband Saddam Hussein’s army. But instead of bringing peace and democracy, early U.S. decisions in Iraq led to chaos, revenge killings, a government that could no longer provide the most basic services, and millions of angry, armed, and unemployed young men.
It only got worse. The continued U.S. presence sparked an insurgency and brought al Qaeda to Iraq. While the Iraq War’s civilian death toll remains disputed, it was certainly high: during the eight years of U.S. military engagement, an estimated 162,000 Iraqis were killed, mostly civilians. Violence began gradually to subside after 2006, for reasons most analysts suspect had little to do with the U.S. troop surge. In 2011, we finally slunk off, tails between our legs. Behind us, we left a squabbling, barely functional government, an economy still in shambles, and a level of civil violence that remains astronomically high.
My point here is not that the Iraq War was a bad idea in the first place (though it certainly was). It’s more depressing than that. My point is that this cynical, foolish, arguably illegal war might still have come right in the end if we had tried a little less hard to fix everything. I remember Iraq in summer 2003, before the bombing of the U.N. building, before suicide bombs and IEDs became near daily occurrences. There was a brief window that spring and summer, a window in which the mood in Baghdad was cautiously celebratory. People spoke freely. NGOs and human rights groups popped up out of nowhere. Saddam was out; hope was in. We should have left then.
What would have happened if the U.S. government had been less determined to fix Iraq’s broken parts? What would have happened if we’d brought our troops back home in summer 2003? What if we’d quelled our national do-gooder instincts, and left the Iraqi army and all but a handful of top Baathist officials in place, offering the rest their lives, liberty, and some generous economic assistance in exchange for genuine cooperation on weapons inspections?
There’s no way to know for sure, but I have an uneasy feeling that a more cynical U.S. government approach from the get-go — an approach that never even contemplated the restoration of democracy — might ultimately have caused less bloodshed.
Or consider Afghanistan, where we recently lost the 2000th U.S. service member. We’ve been at war in Afghanistan for over a decade, struggling to keep the Taliban at bay, build up a democratic government, eliminate corruption, and create an Afghan military capable of defending the population and disinclined to prey on it. All worthy aims — and to achieve them, Presidents Bush and Obama let U.S. troop levels creep up over the course of eleven years, going from fewer than 10,000 in late 2002 to more than 30,000 in November 2008. By mid-2009, Obama had doubled that number. By mid-2010, he had tripled it. And although Obama’s promised "civilian surge" never got very surge-like, we did substantially increase the number of U.S. civilian officials tasked to help with Afghan governance and development issues.
The result? Today the Afghan government remains corrupt, insecurity remains rampant, civilian deaths directly and indirectly attributable to the U.S. presence remain high, and a 2011 poll found that 76 percent of Afghans say they feel "some fear" or "a lot of fear" when encountering international coalition forces. If the Afghan population doesn’t trust us much, we now trust them even less: green-on-blue attacks have spiked, killing more than 50 Americans so far this year. Meanwhile, the Taliban have decided they can’t be bothered to negotiate with us, and as in Iraq, we’re now limping ignominiously towards the exit.
What if the United States had done things differently? What if we had pummeled al Qaeda’s strongholds, helped the Northern Alliance oust the Taliban, and then…left? If we had left early in 2002, we could have continued to strike al Qaeda targets of opportunity as needed, using special operations forces and aerial attacks; we could have used diplomacy and foreign aid to urge governance and human rights reforms. Perhaps there wouldn’t have been many reforms. But would things really be any worse than they are today?
Here again, my point isn’t that the war in Afghanistan was a mistake, or that our efforts in Afghanistan have fallen badly short, an argument that has been made often and persuasively. And I’m not arguing that we’re now "less safe" than we used to be, or insufficiently "more safe" — claims that have always struck me as hard to prove one way or the other. My point is that it has often been our best instincts, not our worst, that have led us to do harm in the world. In Afghanistan and Iraq, we spent billions of dollars and suffered thousands of U.S. casualties. Worst of all, we caused untold suffering for the very populations we so earnestly intended to help.
I’m not suggesting that the United States is all idealism, all the time: we’re capable of plenty of cynicism, and occasional acts of plain old evil. But even our most cynical moments are accompanied by idealism. We want to help, and we want to set things right. We want everyone to share in peace, justice, and the benefits of the American way — even if it hurts.
And hurt it does. The United States and those we try to "help" are often the victims of our own idealistic commitments to democracy, human rights, and the rule of law.
This phenomenon plays out at a micro level as well as a macro level. U.S. construction and economic development projects take far too long and cost far too much, in part because we want everything to satisfy stringent U.S. and international quality standards. Our reconstruction projects are so elaborate that only those Afghans who already have wealth and power have the capacity to serve as sub-contractors; by and large, the result is that power is concentrated even more in the hands of the (often corrupt and violent) few.
Even U.S. detention facilities in Afghanistan are built to Western specifications, complete with climate control systems and electronic security. As a result, we render them virtually useless for the Afghan officials who will inherit them — and who won’t have the trained staff or the unlimited supply of electrical power to make them run. But the suggestion that the Afghans might sometimes be better off with less reeks, to us, of unacceptable double standards.
Or consider a larger and more tragic irony: by late 2009, the United States had embarked on a counterinsurgency-influenced approach to the conflict in Afghanistan. The Afghan population, we decided, was the center of gravity: our success or failure would depend on our ability to protect the population and enable the Afghan government to provide services and thus build legitimacy. Laudable goals! But by making the Afghan population the center of gravity, we also inadvertently placed the Afghan population at the center of a big red bull's-eye. We incentivized the Taliban to combat our efforts by placing IEDs in civilian structures and targeting police, courts, governance, and economic development projects. They did so, with a vengeance.
I could go on — and on, and on — but it’s too depressing.
It’s not surprising that we often fail to achieve our idealistic goals. After all, building a culture that respects human rights, democracy, and the rule of law takes time. Our own imperfect form of democracy — rife as it still is with injustice and corruption — took us more than two centuries to build, though we stood on the shoulders of those who drafted the Magna Carta and the English Bill of Rights. So why should we imagine that durable change could come any faster in societies that start with far less — less wealth, less education, less tradition of democratic government, human rights, or peaceful change?
Simple failure to achieve our loftiest goals could be excused. But if our efforts to help only cause more harm, it’s inexcusable.
Scarred by Vietnam, my parents’ generation came of age with a deep distrust of American power. They suspected that American interventionism never stemmed from pure motives, and never, ever, ended well. My generation came of age at a more hopeful moment: the Berlin Wall came down while I was in college, and the notion of non-ideological U.S. engagement with the world seemed suddenly possible again.
The Rwandan genocide taught my cohort that non-intervention can be as unconscionable as meddling, and Bosnia and Kosovo taught us that U.S. military power could be a force for good. My own early career revolved around human rights work, and brought me to Uganda during the early years of the Lord’s Resistance Army, Kosovo in the wake of the NATO air campaign, and Sierra Leone during that country’s brutal civil war. In each case, U.S. engagement seemed urgent and necessary.
But after all the waste and bloodshed in Iraq and Afghanistan, I've lost much of my faith in our government's ability to do good. The injustice and abuse that once motivated me still do — but I don't have much faith anymore in our ability to restore peace or bring justice.
I’d love to have someone prove me wrong. But here’s my fear: the more we try to fix things, the more we end up shattering them into jagged little pieces.