Elephants in the Room
The Lessons of 1944 Are in Jeopardy
Seventy-five years after D-Day, the United States should remember that on-the-ground leadership still works.
Thursday marks the 75th anniversary of D-Day, the Allied invasion of the Nazis' so-called Fortress Europe during World War II. This anniversary is particularly poignant because it is the last major commemoration for which a significant number of survivors who were actually there will be around to tell us about it. The D-Day generation has been rightly celebrated for what it achieved. Today more than ever, we need to remember what they learned.
D-Day is memorialized as a great battlefield success that led to a great strategic victory and the onset of Pax Americana. Yet it is at least as important to keep in mind what D-Day replaced: a grand strategy of retreat to avoid European troubles—an approach that catastrophically failed to protect U.S. interests. At a time when new voices are once again calling for strategic retreat, it is the history of the 1930s and the dangerous illusions Americans then entertained, not just the heroics on the coast of Normandy, that we need to remember.
D-Day has been called the hinge of history of the 20th century. It was certainly a turning point in World War II. Had the Germans repelled the invasion, as Gen. Dwight D. Eisenhower, the supreme commander of Allied forces in Europe, feared they might, the chance for an Allied victory as early as 1945 would likely have been lost. It is not hard to imagine that an Allied defeat would have broken the political will for demanding unconditional German surrender, paving the way for some sort of negotiated armistice like the one that ended World War I. Of course, it was precisely that sort of ambiguous outcome that had left the post-World War I political settlement so fragile and susceptible to dissolving into another major war just two decades later.
D-Day was also a turning point in terms of the United States’ global role, when the country first began to maintain a continental military presence in the heart of Europe, which continues to this day. After World War I, the United States undertook, as President Warren G. Harding put it, a “return to normalcy”—an effort to avoid military entanglements and let other countries manage their security problems on their own. Across the spectrum of political leaders, intellectual elites, and the general public, there was a collective revulsion at the human toll of World War I and its seeming purposelessness. The revulsion led to the widespread view that the United States could avoid such wars in the future simply by minding its own business.
In grand strategic terms, the post-World War I posture was an experiment in offshore balancing, meaning that the United States eschewed a ground commitment on European soil while maintaining diplomatic and profitable economic relations with all sides. A similar strategic approach prevailed in East Asia as well, with the United States generally looking the other way through the 1930s as Japan took Manchuria, then North China, and then in 1937 launched a full-scale invasion of China that led to a push into Southeast Asia. The strategy was based on the hope that great power rivals would maintain a close-enough equivalency in power so that each would check the other and thereby avoid war. Nowadays, this might also be called a strategy of restraint, since it involved choosing the mind-your-own-business and do-less-if-not-nothing option when "a quarrel in a faraway country between people of whom we know nothing" erupted, in British Prime Minister Neville Chamberlain's memorable words.
Offshore balancing and restraint failed miserably in the interwar years, and D-Day marked the moment when the butcher’s bill came due in the most vivid way. It also was the moment when the United States shifted to the opposite strategy: on-the-ground U.S. military leadership. It turned out U.S. restraint did not empower European leaders to sort out their quarrels through methods short of war. Instead, restraint led those quarrels to fester into bloody political conflicts and from there into a major war. Offshore efforts at balancing turned out to be ineffective, failing in a horribly costly way.
By 1941, it was obvious to all but the most ideologically blinkered that problems abroad, which affected major U.S. economic and political partners, would not stay abroad and would eventually impinge on U.S. freedoms. When Adolf Hitler’s Germany declared war on the United States, following Japan’s attack on Pearl Harbor, the political foundation for a leading global role was set.
The military foundation took a bit longer to come together, and it really wasn't until the success of the D-Day invasion that the military wherewithal caught up with the political strategy. After years of restraint, the United States entered World War II with a paltry military capacity far below its underlying economic capacity. It took a massive rearmament program and some bloody and painful on-the-job training in North Africa and Italy to forge a military capable of defeating the mighty Wehrmacht sitting behind its fortress walls.
And it took the extraordinary courage and sacrifice of the men and women who prepared and then executed the greatest amphibious invasion in human history, a tiny few of whom remain with us today to bear witness to what they achieved.
Today, this lesson remains relevant, perhaps more so than at any time since the 50th anniversary, when many more of their comrades could join them in tribute.
For the lessons learned on D-Day are denied by political leaders and intellectual elites more energetically today than on any day since June 6, 1944. These lessons were hard-learned but have gone out of fashion in the academy and in many political corners as well: that problems left to fester over there do not get better but come to haunt us over here; that the world is safer when the United States builds and wisely wields a military capacity commensurate with its resources; and that U.S. interests are best served by leading a coalition of allies and partners in concert rather than retreating to a cramped and short-term vision of parochial nationalism.
To be sure, there have always been prominent voices arguing against a U.S. leadership role. Some thought the United States could manage well enough even if Hitler dominated the continent and opposed U.S. entry into World War II. Even after the war, the desire for something like a return to normalcy was strong enough to trigger a precipitous demobilization to the point where U.S. forces were ill-equipped and nearly routed during the initial phase of the Korean War, barely a half-decade after D-Day. U.S. political leaders have always had a certain ambivalence about America’s European commitment, and NATO’s role has been hotly debated within policy circles since its inception.
Nevertheless, after carefully weighing the alternatives, policymakers have concluded over and over again that the savings promised by a supposedly cheaper policy of restraint and full offshore balancing were likely to be illusory. Instead, a hardheaded policy realism has recognized that a European commitment better protects U.S. values and interests over the long term. The burdens that such a policy imposes, though real, are manageable given America's overall strength.
The alternative to that prudent policy of leadership is the one that has produced the most terrible costs, which we memorialize again this week: soldiers, sailors, and aircrew obliged to risk their lives to confront a problem that could not be wished away.
Peter D. Feaver is a professor of political science and public policy at Duke University, where he directs the Program in American Grand Strategy.