
Those Who Don’t Investigate the Past Are Doomed to Repeat It

The U.S. government has rejected the chance to study this year’s insurrection. They’ll soon regret the decision.

By Stephen M. Walt, the Robert and Renée Belfer professor of international relations at Harvard University and a columnist for Foreign Policy.
Supporters of former U.S. President Donald Trump gather outside of Trump Tower during a rare visit Trump made to his New York offices in New York City on March 8. Andrew Lichtenstein/Corbis via Getty Images

A few years ago, I had the opportunity to spend a day and a half on board the aircraft carrier USS Theodore Roosevelt as it conducted exercises off the coast of Florida in preparation for an extended deployment. It was a fascinating experience, and one of the activities that impressed me most was watching night landings by the ship’s F-18 squadron. An especially interesting part, at least for me, was observing the debriefing each pilot had to undergo after every flight, in which members of the crew went over each landing and explained what the pilot had done correctly and what needed to be improved. The purpose of these sessions was clear: To maintain a high level of performance, you had to learn from any mistakes you made so you could do better in the future.

I thought of that experience as I read about the Republican Party’s successful effort to prevent Congress from creating an independent bipartisan commission to investigate the violent assault on the U.S. Capitol on Jan. 6. As shocking as the initial attack was, the GOP’s campaign to keep the American people from learning from the experience was, if anything, more disturbing. Republicans were obviously worried an investigation might cast an unfavorable light on their party—and especially on the former president, who egged on the invaders—and they would prefer that the country as a whole never know exactly what happened on that day or why.

One of the supposed strengths of democracies is their tolerance for open discussion, which makes it easier to identify mistakes, alter course, and hold those responsible accountable. But that self-correcting capacity is being eroded these days—and not just here in the United States. In Hungary, for example, Prime Minister Viktor Orban’s Fidesz party has systematically strangled independent media, imposed its own authority over the judiciary, forced the country’s best university—the Central European University—to relocate to Vienna, and is now centralizing its control over the rest of Hungarian academia. Open debate and honest inquiry are no longer welcome. Turkish President Recep Tayyip Erdogan has followed a similar playbook for years, and Brazilian President Jair Bolsonaro and his followers have shown a similar unwillingness to acknowledge unpleasant truths, admit mistakes, or learn from past experiences.

Things are somewhat better here in the United States, but the country’s reluctance to analyze its own errors in a candid and tough-minded way is still a problem. Remember the much-ballyhooed 9/11 Commission? Its final report contained a riveting narrative of the 9/11 plot but declined to hold any officials personally accountable for so much as a lapse in judgment. There has been no official effort to assess the fateful decision to invade Iraq in 2003 or to take stock of the many errors of strategy, judgment, and assessment that led to the subsequent failures there and in Afghanistan, along with the misleading reports of “progress” Americans were repeatedly subjected to. The U.S. Defense Department did investigate the notorious abuses at Abu Ghraib prison, but those efforts focused solely on local commanders or enlisted personnel and not on the civilian officials whose policies made the abuses far more likely. The New York Times correctly judged the Army Inspector General’s report on Abu Ghraib to be a “300-page whitewash,” and Human Rights Watch called it out for avoiding “the logical conclusion that high-level military and civilian officials should be investigated for their role.”

Similarly, former U.S. President Barack Obama chose not to investigate the George W. Bush administration’s use of torture (in violation of both domestic and international law), and former U.S. Attorney General William Barr went to some lengths to undermine the investigation into former U.S. President Donald Trump’s own presidential misconduct. Efforts to teach a more complete and accurate history of racial injustice in the United States now face growing backlash intended not to correct historical errors or offer a fair-minded critique of so-called “critical race theory” but rather to sugarcoat a history that some Americans would prefer to ignore. How much do you want to bet there will never be a Challenger-style inquiry to figure out what the United States did wrong—and also what it did right—in responding to a pandemic that killed more Americans than World War I and II combined?

In the past, one could argue the United States didn’t need official historical investigations of this type. (And to be fair, the record of such efforts is mixed.) In an open society, one could instead rely on scholars and journalists—some of them protected by tenure and all of them by the First Amendment—to do the job. Although different people were bound to reach different conclusions on many issues, a scholarly consensus would emerge over time, and society as a whole would end up wiser. The evaluative process would never be perfect, but it was almost certainly better than a top-down effort to impose an official party line or suppress all inquiries into controversial policy decisions. To use late U.S. Supreme Court Justice Oliver Wendell Holmes’ memorable phrase, a democratic “marketplace of ideas” was preferable to a government monopoly.

I used to believe that view wholeheartedly, but growing evidence the “marketplace of ideas” is pretty easy to rig has undermined my confidence. One obvious problem is secrecy: It’s much harder to evaluate what is going wrong when governments can conceal what they are doing from the public (or when they selectively leak information to bolster their cases). The willingness of some officials to deceive the public in the past helped fuel the epidemic of conspiracy theories that now pollutes public discourse and encourages people to dismiss information they don’t like as nothing more than “fake news.” Trump took full advantage of this skepticism by using that phrase repeatedly, saying “I do it to discredit you all and demean you all so that when you write negative stories about me, no one will believe you.”

Second, the marketplace of ideas is skewed in favor of participants with the loudest megaphones and biggest budgets. Wealthy donors can influence academia in a variety of ways, and once proudly independent think tanks have become overly reliant on politically motivated donations, careless about disclosing conflicts of interest, or engaged in pay-to-play research. When well-funded interests can spend oodles of money to create and disseminate self-serving so-called “expert” testimony, what might appear to be an open debate on a level playing field is anything but. The ability to get at the truth declines along with public confidence in expertise.

The consequences go well beyond losing the ability to learn useful lessons from the past. Eroding confidence in government, the media, academia, or the think tank world becomes an excuse to ignore any information that challenges one’s own beliefs and to instead pick whatever version of so-called “truth” one finds congenial. Ultimately, people will prefer an obvious lie that is consistent with their prior beliefs to a well-verified truth that challenges them. Go far enough down that road and people who know absolutely nothing about infectious diseases or immunology start refusing to wear masks or declining to get vaccinated—or they embrace tinfoil-hat rumors about politicians and pedophile rings.

The marketplace of ideas breaks down even more when certain orthodoxies become so deeply ingrained that hardly anyone ever questions them, and those who do in a serious and sober way are soon dismissed or marginalized. As journalist Walter Lippmann warned, “When all think alike, no one thinks very much.” When conventional wisdoms go unquestioned, a powerful country can keep doing the same dumb things over and over because challenging the consensus is professionally risky. If you want a simple and parsimonious explanation for the recurring failures of post-Cold War U.S. foreign policy, start right there.

How does one tell a serious critique of prevailing orthodoxies from the whack-job notions that seem to have captured many minds? As I detailed a few years ago, serious contrarian critiques are evidence-based and open about events or data that might cast doubt on their arguments. By contrast, conspiracy theories are inherently non-falsifiable; usually blame bad events on small, secretive, and evil cabals with vast but hidden powers; and offer elaborate rationales to explain away contrary evidence. For a contrarian, a lack of evidence for the argument is a problem; for a conspiracy theorist, the absence of supporting evidence is just confirmation of how powerful the conspiracy really is. Of course, once a country stops caring about actual evidence, its ability to evaluate and improve policy over time goes right out the window.

It might also be argued that for democracies, the real source of learning and accountability isn’t what happens in the public sphere; it’s what happens inside the voting booth. In this view, politicians who perform badly eventually have to face the voters, who have the chance to remove them from office and provide their successors with a mandate for change. In theory, the knowledge their actions will be judged at the next election gives even the most venal and dishonest politician some incentive to serve their constituents’ interests, lest they find themselves out of a job when the ballots are counted. Case in point: Trump really did lose the presidential election in 2020.

It’s a nice theory and still partly true, but the combination of gerrymandering, demographic patterns, voter suppression, and the built-in biases of the U.S. Constitution is creating a growing disconnect between whom the majority of voters want and who actually gets elected. As a minority party that keeps moving further and further from the median voter, the last thing the Republican Party wants is fair elections that accurately translate public sentiment into electoral outcomes. As a result, this mechanism of self-correction is atrophying. As politicians become insulated from electoral defeat by the creation of safe seats, even a series of world-class screw-ups may not be enough to get them to change their views or policies—no matter how much evidence is amassed demonstrating they aren’t working.

What does any of this have to do with foreign policy? Plenty. Anything that interferes with a society’s ability to think carefully and critically about its past conduct and identify what is working and what isn’t reduces its capacity to learn from mistakes and makes it far more prone to repeat them. The process of evaluation has to be fair-minded, not just a partisan attempt to score points by conducting protracted witch hunts, purveying easily debunked untruths, or relying primarily on bad-faith arguments. I still believe open societies do a better job of self-criticism than dictatorships do, but that advantage may now be diminishing. If so, the United States’ ability to navigate a complex and challenging world will continue to decline.

Final note: Although I’m fortunate to be fully vaccinated at this point, the gloomy tone of this column suggests I could use a short break. I’ll be back in a few weeks. Until then, I’m counting on the rest of you to give me some good news to write about.

Stephen M. Walt is the Robert and Renée Belfer professor of international relations at Harvard University and a columnist for Foreign Policy.