The War We Deserve

It's easy to blame the violence in Iraq and the pitfalls of the war on terror on a small cabal of neocons, a bumbling president, and an overstretched military. But real fault lies with the American people as well. Americans now ask more of their government but sacrifice less than ever before. It's an unrealistic, even deadly, way to fight a global war. And, unfortunately, that's just how the American people want it.

There’s an uncomplicated tale many Americans like to tell themselves about recent U.S. foreign policy. As the story has it, the nation was led astray by a powerful clique of political appointees and their fellow travelers in Washington think tanks, who were determined even before the 9/11 attacks to effect a radical shift in America’s role in the world. The members of this cabal were known as neoconservatives. They believed the world was a dangerous place, that American power should be applied firmly to protect American interests, and that, for too long, U.S. policy had consisted of diplomatic excess and mincing half measures. After 9/11, this group gave us the ill-conceived Global War on Terror and its bloody centerpiece, the war in Iraq.

This narrative is disturbing. It implies that a small cadre of officials, holding allegiance to ideas alien to mainstream political life, succeeded in hijacking the foreign-policy apparatus of the entire U.S. government and managed to skirt the checks and balances of the U.S. Constitution. Perversely, though, this interpretation of events is also comforting. It offers the possibility of correcting course. If the fault simply lies in the predispositions of a few key players in the policy game, then those players can eventually be replaced, and policies repaired.

Unfortunately, though, this convenient story is fiction, and it’s peddling a dangerously misguided view of history. The American public at large is more deeply implicated in the design and execution of the war on terror than it cares to admit. In the six years of the war, through an invasion of Afghanistan, a wave of anthrax attacks, and an occupation of Iraq, Americans have remained largely unshaken in their commitment to a political philosophy that demands much from its government but asks little of its citizens. And there is no reason to believe that the weight of that responsibility will shift after the next attack.


Since at least the election of Ronald Reagan in 1980, a political philosophy known as neoliberalism has dominated the American political landscape. Defined by a commitment to tax reduction, discipline in fiscal and monetary policy, light regulation of the private sector, and free trade, it has risen above party politics. Leading Democrats have advocated the neoliberal creed, even if they did not use the phrase. It was former President Bill Clinton, after all, who promised the American people in 1996 that "the era of big government [was] over"; that the federal bureaucracy would shrink; and that the federal government would adhere to a program of fiscal balance, regulatory restraint, and trade liberalization.

This neoliberal philosophy is built on a bedrock of skepticism about the role of central government and the effectiveness of grand governmental projects. As a consequence, politics got small. Political leaders learned to shy away from policies that threatened to disrupt the status quo and make great demands of the American polity. A hallmark of the Clinton administration in its later years, after the Democrats’ drubbing in the 1994 midterm elections, was its enthusiasm for "micropolicies" — initiatives that could be linked to great themes but did not incur great costs.

This rejection of sacrifice on a national scale contributed to the bungled war the United States finds itself in today. The war on terror is not simply a neoconservative project. It is as much a neoliberal project, shaped by views about the role of government that enjoy broad public support.

It may seem extraordinary, given the experience of the past six years, to suggest that President George W. Bush’s administration pursued a Clinton-style strategy of accommodation to neoliberal realities. After all, key Bush advisors flaunted their determination to throw off the constraints that bound the executive branch. And the Bush administration’s policies have had cataclysmic consequences — in Iraq alone, there are tens of thousands dead and more than a million people displaced. How can we call this "small politics"?

However, we must first recognize the critical distinction between what the Bush administration intended to do and what actually transpired. The material point about the planned invasion of Iraq was that it appeared to its proponents to be feasible with a very small commitment of resources. It would be "a cakewalk," influential Pentagon advisor Kenneth Adelman predicted in February 2002. The cost of postwar reconstruction would be negligible. Former Deputy Secretary of Defense Paul Wolfowitz suggested that it might even be financed by revenues from the Iraqi oil industry.

Of course, there were critics inside and outside the U.S. government who warned that these forecasts were unduly optimistic. But the administration’s view was hardly idiosyncratic. There were many Americans who believed, based on the experience of the previous decade — including the first Gulf War, subsequent strikes on Iraq, and other interventions such as Kosovo — that the U.S. military had acquired the capacity to project force with devastating efficiency. Consequently, it wasn’t hard to imagine that the invasion and occupation of a nation of 27 million, more than 6,000 miles away, could be accomplished without significant disruption to American daily life.

Even the larger war on terror remains a relatively small affair, demanding little of its masters. Although U.S. defense expenditures have grown substantially during the Bush administration — by roughly 40 percent in inflation-adjusted terms between 2001 and 2006 — it is growth from a historically low base. In the five years after 9/11, average defense expenditure as a share of gross domestic product (3.8 percent) was little more than half of what it was during the preceding 50 years (6.8 percent). The proportion of the U.S. adult population employed in the active-duty military (roughly 0.6 percent) remained at a low not seen since before the attack on Pearl Harbor.

This determination to execute policy without disrupting daily life was maintained even as it became clear that the war on terror was faltering. The U.S. "surge" of troops in Iraq beginning in January 2007, designed to wrest control of the country from insurgents, was advertised as a substantial increase in U.S. commitments in Iraq. In August, the New York Times called it a "massive buildup." But by historical standards, it has been negligible. The United States had more boots on the ground in Japan 10 years after its surrender in 1945 and in Germany at the end of the Cold War. It deployed twice as many troops in South Korea and three times as many in Vietnam.

In 2003, the conflict in Iraq might reasonably have been described as George W. Bush’s war. In 2007, however, it has become a bipartisan war — that is, a conflict whose course is shaped by the actions of a Republican president and by Democratic majorities in Congress. The stakes are substantial: Continued failure in Iraq is bound to have tremendous human and diplomatic costs. Yet the range of policy options is still arbitrarily limited to a token "surge" or various forms of "phased withdrawal." No major political actor, Democrat or Republican, dares to contemplate a genuine surge that would raise the U.S. commitment in Iraq to the level said to be essential by several military leaders before the invasion. Similarly, there has been no serious consideration of a return to the draft, despite strains on the U.S. military. This, the New York Times said — echoing the argument made by Milton Friedman during Vietnam — would be inconsistent with the "free-choice values of America’s market society."


It isn’t just in Iraq where this preference for small-scale politics has shaped the war on terror. Vice President Dick Cheney claimed in 2005 that the Bush administration had been "very aggressive . . . using the tools at our disposal" to defend the U.S. homeland against terrorism. But the administration has not been aggressive in imposing regulatory burdens on the private sector, which owns many of the United States’ most vulnerable targets. In 2002, for example, it declined to assert that the Clean Air Act gave it authority to impose safety requirements on U.S. chemical facilities. Instead, it encouraged voluntary corporate efforts to improve security in this and other sectors of the economy.

First and foremost, the aim was to keep the economy humming. "One of the great goals of this nation’s war," President Bush said immediately after 9/11, "is to restore confidence in the airline industry." His administration quickly launched a "pro-consumption publicity blitz" (in the words of the Boston Globe) on behalf of the U.S. travel industry. The president starred in a campaign by the Travel Industry Association of America, designed, as one industry executive put it, to "link travel to patriotic duty." Many Americans interpreted the campaign as a call to spend more money to boost the economy. "The important thing, war or no war, is for the economy to grow," then White House Press Secretary Ari Fleischer said in 2003.

The extension of such pro-growth policies in a declared time of war has created jarring rhetorical inconsistencies. Historically, war has been regarded — by definition — as a grand project, requiring deep societal shifts and the subordination of other priorities. Traditionally, presidents had the responsibility to remind citizens of this fact. They called on Americans to "make the sacrifices that the emergency demands," as President Franklin D. Roosevelt did less than a year before Pearl Harbor.

President Bush makes a show of continuing this tradition when he tells Americans that "a time of war is a time of sacrifice." But this attempt to link the war on terror to earlier campaigns fails, precisely because today the state is not able to demand comparable sacrifices from its constituents. Asking for real sacrifices and tax hikes doesn’t go over well at the ballot box. And so, when President Bush was asked in 2001 precisely how Americans should contribute to fighting the war on terror, he replied: "Well, I think the average American must not be afraid to travel. … They ought to take their kids on vacations. They ought to go to ball games. … Americans ought to go about their business."

Another strained attempt at analogy was made by the new Transportation Security Administration (TSA), which launched a research-and-development program to achieve "revolutionary" improvements in methods for screening airline passengers and baggage. Part of this program was dubbed the "Manhattan II Project" — an homage to the crash program that produced a nuclear bomb in the Second World War. But the original Manhattan Project was a vast and expensive enterprise. It consumed 1 percent of GDP (in contemporary terms, one quarter of the annual defense budget); employed 130,000 people; enlisted scientists and engineers from leading universities and private industry; and continued until an atomic weapon had been successfully detonated. Manhattan II, by contrast, expended $6 million in its first two years — or less than one ten-thousandth of 1 percent of U.S. GDP.

The TSA’s research-and-development program eventually fell prey to "competing priorities in a tight budget environment." In 2003, the agency cut its R&D budget by half to meet shortfalls elsewhere in the agency. R&D was just one of many areas in which investments in homeland security clashed fatally with fiscal constraints. "From Day One," said Clark Kent Ervin, the former inspector general of the Department of Homeland Security (DHS), attempts to improve domestic security were compromised by a "lack of money."

Conservatives often counter that one consequence of the war on terror has been a complete loss of budgetary discipline in Washington. This is a gross exaggeration. On the contrary, a remarkable feature of post-9/11 budgeting is the extent to which federal expenditures have continued long-standing patterns, even as policymakers professed commitment to a "total war" on terrorism. Federal spending rose from 18.5 percent of GDP in 2001 to 20.3 percent of GDP in 2006, but this was not an extraordinary shift: The average for the 40 years ending in 2006 was 20.6 percent of GDP. Spending on non-entitlement domestic programs, including DHS programs, remains well within historical parameters. Indeed, the largest increase in annual federal outlays between 2000 and 2006 came not in non-defense discretionary spending (a change of $177 billion) or even in defense ($225 billion), but in Social Security, Medicare, Medicaid, and other forms of mandatory spending ($461 billion).

Broadly speaking, three familiar forces have shaped federal budgets since 9/11. The first is the continuing weight of the doctrine of fiscal discipline, which discredits the idea of borrowing to fund security needs. The second is the continued difficulty of entitlement reform, evidenced by the Bush administration’s failed attempt to reform Social Security in 2005, and thus the inability to control growth in mandatory spending.

The third consideration is a resistance to taxation. As a share of GDP, federal taxes increased to their highest level in modern history during the later Clinton years, and popular resentment about federal taxes grew right alongside them. As a consequence, the popularity of the Bush administration’s tax cuts endured after 9/11. One month after the attacks, three quarters of respondents told a Gallup poll that they wanted the first round of tax cuts, introduced in June 2001, to take effect immediately. More than 60 percent favored additional cuts. Bush’s tax reforms reduced the overall tax burden to the historical average. The result, however, has been another stark discontinuity between the rhetoric of the war on terror and the realities on the ground. President Bush claimed that the United States had undertaken a "historic mission" after 9/11. But this was a highly unusual security crisis — one in which the tax burden imposed by the federal government declined. The president even defended the tax cuts as a critical component of protecting the home front in the war on terror, designed to safeguard an economy targeted by al Qaeda. Tax cuts, said Bush, would "make sure that the consumer has got money to spend in the short term." The effects of this pro-consumption policy have sometimes been perverse. For example, containerized maritime imports rose by 64 percent after 9/11, even as policymakers wrung their hands about weaknesses in container and port security.


So, if taxes have declined, a draft has been avoided, and regulatory burdens have been minimized, were there any other ways in which the broad mass of the American public might have carried a heavier load after 9/11?

Civil libertarians certainly think Americans have paid a large if intangible price in the rollback of their civil liberties. Here, critics also reach for analogies between the war on terror and earlier conflicts. They accuse the Bush administration of trampling on civil liberties in the name of national security, just as the government had during the First and Second World Wars, the Cold War, and the domestic turmoil of the late 1960s and early 1970s. The steps taken after 9/11 were "chillingly familiar," reported the San Francisco Chronicle. The historian Alan Brinkley said the government’s treatment of civil liberties was a "familiar story." In 2002, The Progressive said, "We’ve been here before."

But we haven’t been here before. Infringements of Americans’ rights after 9/11 — that is, actual rather than anticipated infringements — were different in type and severity from those suffered in earlier crises. Citizens were not imprisoned for treason, as they were during the First World War. Thousands of citizens were not detained indefinitely, as they were during the Second World War. Citizens were not deported, or denied passports, or blacklisted, as they were during the Red scares.

Were there serious issues about the denial of citizens’ rights after 9/11? Undoubtedly. But those violations often had a distinctly postmillennial character. New surveillance programs were launched in secrecy and designed so that their footprint could not be easily detected. In effect, government was adapting to political realities, searching for techniques of maintaining domestic security that did not involve obvious disruptions of everyday life.

Foreigners, in contrast, had a much rougher time. The second distinguishing feature of the war on terror, insofar as basic rights are concerned, is the extent to which the heaviest burdens were sent abroad. The most obvious and grievous harms — kidnapping, secret detention, abusive interrogation, denial of habeas corpus — have been deliberately perpetrated against foreigners rather than citizens.

Indeed, the spirit of sacrifice doesn’t even permeate the Bush administration itself. The president’s first choice as head of the Federal Emergency Management Agency resigned 15 months after 9/11 to head a consulting business that would "take advantage of business opportunities in the Middle East following the conclusion of the U.S.-led war in Iraq." By 2006, two thirds of the Department of Homeland Security’s senior executives had departed, often taking more lucrative positions as lobbyists for DHS contractors. A 2007 report for the Homeland Security Advisory Council worried about a "Homeland Security ‘meltdown’" because of turnover in DHS leadership.

Some have borne serious costs in this war on terror, though, beginning of course on Day One. And as of today, more than 4,000 U.S. soldiers have been killed in Iraq and Afghanistan, and more than 13,000 have been injured so severely that they could not return to duty. At least 70,000 Iraqis have died since March 2003. Immigrant communities within the United States complain of surveillance and the abusive application of immigration laws. These pains are substantial but concentrated, often on politically marginal constituencies.

In fact, some of these costs are aggravated precisely because the Bush administration wants to avoid policies that incur more broadly distributed burdens. "We’re fighting the enemy abroad," the Bush White House said in 2005, "so that we don’t have to fight them here." This course of action has been presented as a matter of national security — a sensible form of forward defense. However, it’s also good domestic politics. "Fighting them here" would mean higher taxes, bigger bureaucracies, tighter regulation, clearer challenges to civil liberties, and more impediments to trade. The Bush administration did not want that, because it understood correctly that most Americans did not want it, either.

But the consequences of this basic policy decision have been profoundly harmful to U.S. interests. The United States has failed to take steps domestically that would enhance security. It has stumbled into overseas conflicts marred by poor planning and vague objectives. Its standing as a champion of human rights has been badly damaged. It has left itself open to charges of hypocrisy — using the language of sacrifice to cloak a policy of business as usual.

Meanwhile, Americans are still as committed to the principles of fiscal discipline, low taxes, light regulation, and free trade as they were on Sept. 10, 2001. They remain deeply skeptical of "big government" and federal bureaucracy. Indeed, a 2006 Gallup poll found that a large majority of Americans, Republicans and Democrats alike, believed that "big government" posed the biggest threat to the country in the future, ahead of both "big business" and "big labor."

Was the war on terror devised and promoted by a small cadre of neoconservatives? Perhaps. But it was also a response to a crisis that recognized and largely respected the well-defined boundaries of acceptable political action in the United States today. In important ways, the war on terror is not their war but our war. The desires and preferences of the American people have shaped the war on terror just as profoundly as any neoconservative doctrine on the conduct of U.S. foreign policy.

So Americans can keep reciting their favorite myth — even as the broad currents of U.S. politics remain unchanged. But the next time the war suffers a setback or a terror attack hits home, we shouldn’t expect the country’s response to differ much from the war the Bush administration launched six years ago. Americans might try to pin their problems on a few powerful neocons. In truth, though, they must shoulder some of the blame.

Alasdair Roberts is director of the School of Public Policy at University of Massachusetts Amherst. He is a fellow of the National Academy of Public Administration. His latest book is Can Government Do Anything Right?