A special anniversary report challenging the world's most dangerous thinking.
- By Andrew Swift, an editorial researcher at Foreign Policy.
In Foreign Policy’s first issue, published at the height of American exhaustion with the war in Vietnam, founders Samuel P. Huntington and Warren D. Manshel promised to challenge the prevailing orthodoxy in Washington. And with provocative essays from the likes of John Kenneth Galbraith — who famously coined the term “conventional wisdom” and spent a career fighting against it — and Richard Holbrooke — who as a serving Foreign Service officer ripped the State Department as “the machine that fails” — an insurgency was born. Forty years later, upending assumptions is embedded in FP’s DNA. In that spirit, we offer this, our 40th Anniversary package tackling the world’s most dangerous conventional wisdoms.
- Economies Can’t Just Keep On Growing, by Thomas Homer-Dixon
- Homeland Security Hasn’t Made Us Safer, by Anne Applebaum
- China’s Rise Doesn’t Mean War…, by Joseph S. Nye Jr.
- …And China Isn’t Beating the U.S., by Daniel W. Drezner
- Understanding History Won’t Help Us Make Peace, by Aluf Benn
- America Pressures Israel Plenty, by Leslie H. Gelb
- Actually, the Retirement Age Is Too High, by James K. Galbraith
- The Rich Really Don’t Care About the Poor, by Carl Pope
- The Global Economy Won’t Recover, Now or Ever, by Immanuel Wallerstein
- Sovereignty Is Far From Dead, by Nina Hachigian
- Democracy Is Still Worth Fighting For, by Morton Halperin
- Sometimes, the Conventional Wisdom Is Right, by Stephen Sestanovich
ECONOMIES CAN’T JUST KEEP ON GROWING
Humanity has made great strides over the past 2,000 years, and we often assume that our path, notwithstanding a few bumps along the way, goes ever upward. But we are wrong: Within this century, environmental and resource constraints will likely bring global economic growth to a halt.
Limits on available resources already restrict economic activity in many sectors, though their impact usually goes unacknowledged. Take rare-earth elements — minerals and oxides essential to the manufacture of many technologies. When China recently stopped exporting them, sudden shortages threatened to crimp a wide range of industries. Most commentators assumed the supply crunch would ease once new (or mothballed) rare-earth mines opened. But such optimism overlooks a fundamental physical reality: As the best bodies of ore are exhausted, miners move on to less concentrated deposits in more forbidding locations. These mines cause more pollution and require more energy. In other words, opening new rare-earth mines outside China will carry a staggering environmental cost.
Or consider petroleum, which provides about 40 percent of the world’s commercial energy and more than 95 percent of its transportation energy. Oil companies generally have to work harder to get each new barrel of oil. The amount of energy they receive for each unit of energy they invest in drilling has dropped from 100 to 1 in Texas in the 1930s to about 15 to 1 in the continental United States today. The oil sands in Alberta, Canada, yield a return of only 4 to 1.
Coal and natural gas still have high energy yields. So, as oil becomes harder to get in coming decades, these energy sources will become increasingly vital to the global economy. But they’re fossil fuels, and burning them generates climate-changing carbon dioxide. If the World Bank’s projected rates for global economic growth hold steady, global output will have risen almost tenfold by 2100, to more than $600 trillion in today’s dollars. So even if countries make dramatic reductions in carbon emissions per dollar of GDP, global carbon dioxide emissions will triple from today’s level to more than 90 billion metric tons a year. Scientists tell us that tripling carbon emissions would cause such extreme heat waves, droughts, and storms that farmers would likely find they couldn’t produce the food needed for the world’s projected population of 9 billion people. Indeed, the economic damage caused by such climate change would probably, by itself, halt growth.
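The projection above is straightforward compound-growth arithmetic, and it can be checked in a few lines. This is a minimal sketch; the baseline figures (2010 gross world product, current emissions, the implied growth rate) are illustrative assumptions consistent with the text, not World Bank data.

```python
# Back-of-envelope check of the tenfold-output, tripled-emissions claim.
base_output = 63e12    # rough 2010 gross world product in USD (assumed)
growth_rate = 0.026    # ~2.6%/yr, the pace implied by "tenfold by 2100"
years = 90             # 2010 -> 2100

output_factor = (1 + growth_rate) ** years
output_2100 = base_output * output_factor
print(f"2100 output: ${output_2100 / 1e12:.0f} trillion")  # roughly $600+ trillion

# Even if carbon intensity (CO2 per dollar of GDP) falls by 70 percent,
# a tenfold-larger economy still roughly triples total emissions.
base_emissions = 30e9  # ~30 billion metric tons CO2/yr today (assumed)
intensity_cut = 0.70
emissions_2100 = base_emissions * output_factor * (1 - intensity_cut)
print(f"2100 emissions: {emissions_2100 / 1e9:.0f} billion tons/yr")  # ~90 billion
```

The point of the sketch is that the tripling survives even a dramatic 70 percent cut in emissions per dollar: intensity improvements are simply outrun by ninety years of compounding growth.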
Humankind is in a box. For the 2.7 billion people now living on less than $2 a day, economic growth is essential to satisfying the most basic requirements of human dignity. And in much wealthier societies, people need growth to pay off their debts, support liberty, and maintain civil peace. To produce and sustain this growth, they must expend vast amounts of energy. Yet our best energy source — fossil fuel — is the main thing contributing to climate change, and climate change, if unchecked, will halt growth.
We can’t live with growth, and we can’t live without it. This contradiction is humankind’s biggest challenge this century, but as long as conventional wisdom holds that growth can continue forever, it’s a challenge we can’t possibly address.
Thomas Homer-Dixon is the CIGI chair of global systems at the Balsillie School of International Affairs in Waterloo, Canada.
HOMELAND SECURITY HASN’T MADE US SAFER
Hardly anyone has seriously scrutinized either the priorities or the spending patterns of the U.S. Department of Homeland Security (DHS) and its junior partner, the Transportation Security Administration (TSA), since their hurried creation in the aftermath of the 9/11 attacks. Sure, they get criticized plenty. But year in, year out, they continue to grow faster and cost more — presumably because Americans think they are being protected from terrorism by all that spending. Yet there is no evidence whatsoever that the agencies are making Americans any safer.
DHS serves only one clear purpose: to provide unimaginable bonanzas for favored congressional districts around the United States, most of which face no statistically significant security threat at all. One thinks of the $436,504 that the Blackfeet Nation of Montana received in fiscal 2010 “to help strengthen the nation against risks associated with potential terrorist attacks”; the $1,000,000 that the village of Poynette, Wisconsin (pop. 2,266) received in fiscal 2009 for an “emergency operations center”; or the $67,000 worth of surveillance equipment purchased by Marin County, California, and discovered, still in its original packaging, four years later. And indeed, every U.S. state, no matter how landlocked or underpopulated, receives, by law, a fixed percentage of homeland security spending every year.
As for the TSA, I am not aware of a single bomber or bomb plot stopped by its time-wasting procedures. In fact, TSA screeners consistently fail to spot the majority of fake “bombs” and bomb parts the agency periodically plants to test their skills. In Los Angeles, whose airport was targeted by the “millennium plot” on New Year’s 2000, screeners failed some 75 percent of these tests.
Terrorists have been stopped since 2001 and plots prevented, but always by other means. After the Nigerian “underwear bomber” of Christmas Day 2009 was foiled, DHS Secretary Janet Napolitano claimed “the system worked” — but the bomber was caught by a passenger, not the feds. Richard Reid, the 2001 shoe bomber, was undone by an alert stewardess who smelled something funny. The 2006 Heathrow Airport plot was uncovered by an intelligence tip. Al Qaeda’s recent attempt to explode cargo planes was caught by a human intelligence source, not an X-ray machine. Yet the TSA responds to these events by placing restrictions on shoes, liquids, and now perhaps printer cartridges.
Given this reality — and given that 9/11 was, above all, a massive intelligence failure — wouldn’t we be safer if the vast budgets of TSA and its partners around the world were diverted away from confiscating nail scissors and toward creating better information systems and better intelligence? Imagine if security officers in Amsterdam had been made aware of the warnings the underwear bomber’s father gave to the U.S. Embassy in Abuja. Or, for that matter, if consular officers had prevented him from receiving a visa in the first place.
Better still, DHS could be broken up into its component parts, with special funding and planning carried out at the federal level only for cities and buildings that are actually at risk of terrorist attack. Here is the truth: New York City requires a lot more homeland security spending, per capita, than Poynette. Here is the even starker truth: Poynette needs no homeland security spending at all. The events of 9/11 did not prove that the United States needs to spend more on local police forces and fire brigades; they proved that Americans need to learn how to make better use of the information they have and apply it with speed and efficiency.
Anne Applebaum is a columnist for the Washington Post and Slate.
CHINA’S RISE DOESN’T MEAN WAR…
Thucydides famously attributed the Peloponnesian War to the rise in power of Athens and the fear it created in Sparta. A century ago, Germany’s rise and the fear it created in Britain helped cause World War I. Now, it’s become a new conventional wisdom in some circles that China’s rise and the fear it is creating in the United States — where recent polls show 60 percent believe the country is in decline — could doom the 21st century to a similar fate. As scholar John Mearsheimer has put it, China’s rise cannot be peaceful.
One should be skeptical about such dire projections. Americans go through cycles of declinism every decade or so, but that tells us more about America’s psychology than its power resources. Not only is the United States likely to remain the most powerful country in the first half of this century, but China still has a long way to go to catch up in military, economic, and soft power.
In contrast, Germany had already surpassed Britain in industrial power by 1900, and the kaiser was pursuing an adventurous, globally oriented foreign and military policy that was bound to bring about a clash. But China today has focused its policies primarily on its region and its own economic development. China’s “market-Leninist” economic model is attractive in authoritarian countries, but this so-called Beijing Consensus has the opposite effect in most democracies.
And even if China’s GDP passes U.S. GDP around 2027 (as Goldman Sachs now projects), the two economies would be equivalent in size, not equal in composition. China would still face massive rural poverty and enormous inequality, and it will begin to encounter demographic problems from the delayed effects of its one-child policy. Moreover, as countries develop, there is a natural tendency for growth rates to slow. By my calculations, if China’s annual growth goes down to 6 percent and the U.S. economy grows at 2 percent per year after 2030, China will not equal the United States in per capita income until decades later. So China is a long way from posing the kind of challenge to America that the kaiser’s Germany posed to Britain in 1900.
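Nye's "decades later" conclusion follows from compounding the growth differential against the population gap. A rough sketch, under two hypothetical assumptions not stated in the article: total GDPs are equal in 2030, and China's population remains about 4.3 times America's.

```python
import math

china_growth, us_growth = 0.06, 0.02  # Nye's assumed post-2030 growth rates
pop_ratio = 4.3  # China population / U.S. population (assumed, held constant)

# With equal total GDPs, China's per capita income starts at 1/4.3 of
# America's. Parity arrives when the growth differential has compounded
# away that gap: (1.06/1.02)^t = 4.3.
years = math.log(pop_ratio) / math.log((1 + china_growth) / (1 + us_growth))
print(f"per capita parity: ~{years:.0f} years after 2030")  # ~38 years
```

Under these assumptions parity in per capita income would not arrive until around 2070 — "decades later," as the text says, and later still if China's growth slows further as it matures.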
None of this means the dangers of conflict can be completely ruled out in Asia, as China’s recent disputes over various contested island chains remind us. But given shared global challenges like financial stability, cybercrime, nuclear proliferation, and climate change, China and the United States also have much to gain from working together. Unfortunately, faulty projections that create hubris among some Chinese and unnecessary fear of decline among some Americans could make it difficult to ensure this future.
Not every power’s rise leads to war — witness America’s peaceful overtaking of Britain at the end of the 19th century. So remembering Thucydides’s advice, it is important to prevent exaggerated fears from leading to a self-fulfilling prophecy. Or, to paraphrase Franklin D. Roosevelt, we can make ourselves safer by being wary of fear itself.
Joseph S. Nye Jr. is University Distinguished Service Professor at Harvard University and author of The Future of Power.
…AND CHINA ISN’T BEATING THE U.S.
China is a great power in every sense of the word. It is the most populous country in the world. The Middle Kingdom has weathered the Great Recession better than the West. It is developing a blue-water navy to rival the United States in the Pacific. In 2010, China surpassed Japan as the world’s second-largest economy. For many Americans, however, this is not enough. Politicians, commentators, and the public believe China has already supplanted the United States to achieve primacy in world politics. This is not only wrong — it is dangerously wrong.
According to a November 2009 Pew Research Center survey, 44 percent of Americans believe that China is “the world’s leading economic power,” compared with 27 percent who name the United States. Elites have fed this mass perception. After a midterm election cycle that featured anti-China ad after anti-China ad, President Barack Obama warned, “Other countries like China aren’t standing still, so we can’t stand still either.” With public perception and political rhetoric like this, it is little wonder that Forbes magazine recently named Chinese President Hu Jintao the world’s most powerful individual.
It’s time to make a few things clear. If one measures power strictly according to GDP at market exchange rates, then the United States is roughly 250 percent more powerful than China. If one uses a combination of metrics — as does, for example, the U.S. National Intelligence Council’s 2025 project — then China possesses a little less than half of America’s relative power. Even on the financial side, the U.S. still reigns, and, hype notwithstanding, the dollar is not going anywhere as the world’s reserve currency. The renminbi could be an alternative in the far future — but after the 2008 financial crisis, China is loath to open up its capital markets. Even by the less tangible metrics of soft power, the United States still outperforms China handily in new public opinion surveys from the Pacific Rim by the Chicago Council on Global Affairs.
Right now, the United States is vastly more powerful than the People’s Republic of China. Anyone telling you otherwise is selling you something.
Why the massive misperception? In part, people are looking at the wrong measures. China has the world’s largest currency reserves, leading many to conclude that Beijing now has the ability to dictate terms to the United States and everyone else. But that just ain’t so. The “balance of financial terror” constrains China as well as the United States because China needs American consumers at least as much as the United States needs China to buy its debt.
No doubt, China amassed more power while American might ebbed over the last decade, and Beijing is now throwing its weight around. But the United States still has a huge lead. As for China’s recent bout of belligerence, it has yielded Beijing little beyond Japan releasing a fishing-boat captain — while pushing South Asia and the Pacific Rim closer to the United States.
Exaggerating Chinese power has consequences. Inside the Beltway, attitudes about American hegemony have shifted from complacency to panic. Fearful politicians representing scared voters have an incentive to scapegoat or lash out against a rising power — to the detriment of all. Hysteria about Chinese power also provokes confusion and anger in China as Beijing is being asked to accept a burden it is not yet prepared to shoulder. China, after all, ranks 89th in the 2010 U.N. Human Development Index, just behind Turkmenistan and the Dominican Republic (the United States is fourth). Treating Beijing as more powerful than it is feeds Chinese bravado and insecurity at the same time. That is almost as dangerous a political cocktail as fear and panic.
Daniel W. Drezner, professor of international politics at Tufts University’s Fletcher School of Law and Diplomacy, blogs at ForeignPolicy.com.
UNDERSTANDING HISTORY WON’T HELP US MAKE PEACE
We’re often told that understanding history will help us better understand the present. But the past can be a very dangerous place. Just look at the world’s longest-lasting conflicts — between Palestinians and Israelis, Indians and Pakistanis, the peoples of the Balkans. All involve long, and unhealthy, glances in the rearview mirror.
Where I come from, the political culture has been shaped by incessant digging in history, aimed at supporting the dueling narratives of the Palestine conflict. My inbox often collapses under the weight of contradictory evidence purporting to prove that either the Jews or the Arabs inhabited the Holy Land first. There is hardly a scientific discipline that has not been invoked to support conflicting claims about the past, from archaeology and philology to biology and genetics.
Resolving the mystery of who was here first has become an obsession because it seems to offer a final judgment on who is right and who is wrong, on who were the indigenous people of Palestine and who were the usurpers. Alas, whenever one side boasts about the ultimate proof, its adversary produces another, better one. A 2003 Israeli textbook aimed at teaching the conflicting narratives side by side shows how pointless our debates have become: The Jewish narrative relies on the Bible to link today’s Israelis to the ancient Israelites while the Palestinian counternarrative reaches back to the Jebusites, who ruled Jerusalem before King David’s occupation, as the forefathers of contemporary Palestinians.
Unfortunately, this pseudo-historical dispute lies at the heart of the current political debate. At the most crucial moment of the 2000 Camp David summit, for example, Palestinian leader Yasir Arafat argued with Israeli Prime Minister Ehud Barak and U.S. President Bill Clinton about whether a Jewish temple preceded the Muslim shrines on the Jerusalem site known as the Temple Mount or Haram al-Sharif. Having failed to resolve the present-day conflict, the leaders retreated to a pointless debate about history. Their failure led to the bloodbath of the Second Intifada.
The current Israeli and Palestinian leaders are similarly obsessed. Prime Minister Benjamin Netanyahu, son of an eminent historian, demands Palestinian recognition of Israel as “the state of the Jewish people,” arguing that only such recognition could end the conflict. Recently, he declared Rachel’s Tomb in Bethlehem and the Cave of the Patriarchs in Hebron “national heritage sites.” When UNESCO objected that these sites lie in occupied Palestinian territory, Netanyahu accused it of trying to “detach the people of Israel from its heritage.” For his part, Palestinian President Mahmoud Abbas focuses on his people’s victimhood and seeks “justice” for past abuses, the 1948 exile foremost among them.
Because accepting the other side’s narrative amounts to destroying your own, there can be no compromise.
This has not always been so. When Israeli Prime Minister Menachem Begin and Egyptian President Anwar Sadat negotiated their peace deal in the late 1970s, they put the past aside and constructed a new relationship rather than fruitlessly debate who had been the aggressor and who had been the victim. When Arafat and Prime Minister Yitzhak Rabin signed the 1993 Oslo Accords, they wasted little time on history lessons and focused instead on building a future. But Rabin’s 1995 assassination and the ensuing collapse of Oslo brought the ghosts back to the negotiating table, where they have stayed ever since.
In Europe, the belligerent past is visible everywhere, but contemporary European politicians wisely ignore it and look forward. Would that our squabbling leaders in the Middle East could do the same.
Aluf Benn is editor at large of the Israeli daily Haaretz.
AMERICA PRESSURES ISRAEL PLENTY
Scholars, pundits, propagandists, and journalists have created two dangerous pieces of conventional wisdom about the Middle East: that Israelis, not Palestinians, have been the main stumbling block to peace, and that the United States has failed to use its influence to pressure Israel for serious compromises. Both propositions are largely untrue. If uncorrected, these myths could make both Palestinians and Israelis feel irretrievably wronged and unwilling ever to negotiate in good faith.
Israel has a long and compelling history of making major concessions to Arabs. Egyptian President Anwar Sadat went to Jerusalem in 1977, and less than two years later, Israel agreed to return the entire Sinai Peninsula, booty of a war it did not start and an act of territorial generosity unprecedented in modern history. Israelis negotiated with Palestinian leader Yasir Arafat, whom they rightly considered a terrorist. At the end of U.S. President Bill Clinton’s administration, Israeli Prime Minister Ehud Barak offered the Palestinians several key concessions, including more than 90 percent of the West Bank. After the Annapolis initiative put forward by George W. Bush’s administration, one of Barak’s successors, Ehud Olmert, upped the offerings considerably: more of the West Bank, a sliver of Arab East Jerusalem for a Palestinian capital, and a land swap to give the new Palestinian state a land link to Gaza. Olmert even privately accepted “the right of return” to Israel for a certain number of Palestinian refugees. To both generous proposals, the Palestinians said either no or nothing.
In return, the Israelis received little, and the Palestinians insisted on more of everything. Their rationale was that they needed further concessions to compensate for Israeli demands limiting the size and capability of Palestinian security forces. These restrictions, they said, would undermine the very feasibility of a sovereign Palestinian state. The Israelis argued that they needed these added protections because Israel could not count on Arabs to accept Israel’s existence. They cited the Palestinians’ rejection of a Jewish history in Israel and even any Jewish connection to the Temple Mount in the heart of Jerusalem. For good measure, the Palestinians still refuse to recognize Israel as a “Jewish state.” Nor do the Palestinians acknowledge that when Israel departed Gaza in 2005, it uprooted 9,000 Israeli settlers. In return, Israel got rockets and a terrorist enclave run by Hamas.
At each step in this tortuous negotiating process, the United States has pushed and pulled Israel toward concessions, but received little or no credit from the Arab side. Sometimes this pressure has been public, as in President Barack Obama’s recent scolding of Israel over its West Bank settlements, but more often it has been private. Yet Arabs have not wanted to credit Washington’s role as a peacemaker because they think the United States is capable of exerting even more pressure on Israel. Nonetheless, the American role has been real and substantial.
Israel has only made this world of misperceptions worse. It has explained its concessions badly, if at all. Consider, for example, how Israeli governments refuse to tout their history of concessions on the West Bank for fear these would be taken as starting points in ongoing negotiations. Apparently, Israelis would rather look guilty than weak.
Israelis certainly deserve criticism for continuing their West Bank settlements. But they deserve credit for their concrete efforts to make peace. And so does the United States. Yet the myths prevail, and dangerously so.
Leslie H. Gelb is president emeritus of the Council on Foreign Relations, a former senior U.S. government official, and a former columnist for the New York Times.
ACTUALLY, THE RETIREMENT AGE IS TOO HIGH
The most dangerous conventional wisdom in the world today is the idea that with an older population, people must work longer and retire with less.
This idea is being used to rationalize cuts in old-age benefits in numerous advanced countries — most recently in France, and soon in the United States. The cuts are disguised as increases in the minimum retirement age or as increases in the age at which full pensions will be paid.
Such cuts have a perversely powerful logic: “We” are living longer. There are fewer workers to support each elderly person. Therefore “we” should work longer.
But in the first place, “we” are not living longer. Wealthier elderly are; the non-wealthy not so much. Raising the retirement age cuts benefits for those who can’t wait to retire and who often won’t live long. Meanwhile, richer people with soft jobs work on: For them, it’s an easy call.
Second, many workers retire because they can’t find jobs. They’re unemployed — or expect to become so. Extending the retirement age for them just means a longer job search, a futile waste of time and effort.
Third, we don’t need the workers. Productivity gains and cheap imports mean that we can and do enjoy far more farm and factory goods than our forebears, with much less effort. Only a small fraction of today’s workers make things. Our problem is finding worthwhile work for people to do, not finding workers to produce the goods we consume.
In the United States, the financial crisis has left the country with 11 million fewer jobs than Americans need now. No matter how aggressive the policy, we are not going to find 11 million new jobs soon. So common sense suggests we should make some decisions about who should have the first crack: older people, who have already worked three or four decades at hard jobs? Or younger people, many just out of school, with fresh skills and ambitions?
The answer is obvious. Older people who would like to retire and would do so if they could afford it should get some help. The right step is to reduce, not increase, the full-benefits retirement age. As a rough cut, why not enact a three-year window during which the age for receiving full Social Security benefits would drop to 62 — providing a voluntary, one-time, grab-it-now bonus for leaving work? Let them go home! With a secure pension and medical care, they will be happier. Young people who need work will be happier. And there will also be more jobs. With pension security, older people will consume services until the end of their lives. They will become, each and every one, an employer.
A proposal like this could transform a miserable jobs picture into a tolerable one, at a single stroke.
James K. Galbraith is author of The Predator State: How Conservatives Abandoned the Free Market and Why Liberals Should Too.
THE RICH REALLY DON’T CARE ABOUT THE POOR
We spend far too much time and energy worrying about the supposed global divide between north and south, rich country and poor country. It doesn’t actually exist. The planet’s real fault line is between elites and the middle class in some countries, and the bottom of the pyramid, everywhere.
The world’s four richest citizens — Carlos Slim, Bill Gates, Warren Buffett, and Mukesh Ambani — have more in common with each other than they do with the bottom strata of their respective countries. Yes, they do handle their wealth differently. Gates and Buffett are giving most of it away, Ambani just built the world’s most expensive house, and Slim is somewhere in the middle. But all four can count on their home governments to take care of their needs first. Preserving that kind of social hierarchy is an unwritten assumption in deciding which solutions to the world’s problems arrive on the table and which do not.
Many have observed that countries whose boundaries happen to include large deposits of oil, diamonds, tropical timber, or some other valuable commodity tend to have miserable populations that suffer from poverty and violence — the “resource curse.” But we politely overlook the reality that for every resource-cursed country, there is a resource-blessed kleptocracy. With rare exceptions like Sudan, those who pillage their countries’ wealth are accepted into the top ends of global society. They come to Davos, stay at the Four Seasons, bank in Switzerland. The oil oligarchs of the Persian Gulf are welcomed as investors in News Corporation and American banks, even when they hold views that might otherwise put them on a U.S. terrorism watch list.
India, justifiably, comes to global climate negotiations to argue that it has hundreds of thousands of villages with no access at all to electricity and that the United States and Europe cannot reasonably say, “Well, given the climate crisis, those villages are just going to remain in the dark.” That reality gets a lot of attention. But the other reality is that India devotes very little of its energy investment to getting light for those villages, while it invests considerably more in keeping Ambani’s streetlights on. India is not exceptional. In fact, the poor pay 20 percent of the world’s lighting bill — and get only 0.1 percent of the world’s lighting services in exchange.

Seventy-five years after Franklin D. Roosevelt demonstrated that reliable access to credit was the key to electrifying rural households, the United States still has a shameful number of off-grid communities on Native American reservations.
We write and talk glibly about the increasing emphasis in our economies on “knowledge,” but rarely focus on the reality that a knowledge-based society makes it far easier for much of the workforce to be left behind. Ugandan coffee farmers receive only 2.5 percent of the British retail price and 4.5 percent of the U.S. retail price for their coffee. A few cents added to a cup of coffee or a basket of strawberries would cover the costs of doubling or tripling the wages of peasant farmers. But if we raised prices in today’s world economy, the increases would be absorbed not by the farmers but by marketing, wholesaling, and retailing markups.
The knowledge workers and investors who benefit from this global supply chain have far more in common with each other than they do with the peasant coffee growers who supply their corner Starbucks. Enriching them would mean lowering the status and wealth of bankers, distributors, and advertisers — and they’ve got all the leverage.
Together, Slim, Gates, Buffett, and Ambani control more wealth than the world’s poorest 57 countries. The danger is that while we have a global economy that knows how to concentrate money and power in an ever smaller set of hands, we have no robust mechanism to alert us to the injustice, dangers, and instability that come along with this package. Someday, to our peril, the poor will find their own way to remind us.
Carl Pope is chairman of the Sierra Club.
THE GLOBAL ECONOMY WON’T RECOVER, NOW OR EVER
Virtually everyone everywhere — economists, politicians, pundits — agrees that the world has been in some kind of economic trouble since at least 2008. And virtually everyone seems to believe that in the next few years the world will somehow “recover” from these difficulties. After all, upturns always occur after downturns. The remedies recommended vary considerably, but the idea that the system shall continue in its essential features is a deeply rooted faith.
But it is wrong. All systems have lives. When their processes move too far from equilibrium, they fluctuate chaotically and bifurcate. Our existing system, what I call a capitalist world-economy, has been in existence for some 500 years and has for at least a century encompassed the entire globe. It has functioned remarkably well. But like all systems, it has moved steadily further and further from equilibrium. For a while now, it has moved too far from equilibrium, such that it is today in structural crisis.
The problem is that the basic costs of all production have risen remarkably. There are the personnel expenses of all kinds — for unskilled workers, for cadres, for top-level management. There are the costs incurred as producers pass on the costs of their production to the rest of us — for detoxification, for renewal of resources, for infrastructure. And the democratization of the world has led to demands for more and more education, more and more health provisions, and more and more guarantees of lifetime income. To meet these demands, there has been a significant increase in taxation of all kinds. Together, these costs have risen beyond the point that permits serious capital accumulation. Why not then simply raise prices? Because there are limits beyond which prices can be pushed before buyers balk; economists call this the elasticity of demand. The result is a growing profit squeeze, which is reaching a point where the game is not worth the candle.
What we are witnessing as a result is chaotic fluctuations of all kinds — economic, political, sociocultural. These fluctuations cannot easily be controlled by public policy. The result is ever greater uncertainty about all kinds of short-term decision-making, as well as frantic realignments of every variety. Doubt feeds on itself as we search for ways out of the menacing uncertainty posed by terrorism, climate change, pandemics, and nuclear proliferation.
The only sure thing is that the present system cannot continue. The fundamental political struggle is over what kind of system will replace capitalism, not whether it should survive. The choice is between a new system that replicates some of the present system’s essential features of hierarchy and polarization and one that is relatively democratic and egalitarian.
The extraordinary expansion of the world-economy in the postwar years (more or less 1945 to 1970) has been followed by a long period of economic stagnation in which the basic source of gain has been rank speculation sustained by successive indebtednesses. The latest financial crisis didn’t bring down this system; it merely exposed it as hollow. Our recent “difficulties” are merely the next-to-last bubble in a process of boom and bust the world-system has been undergoing since around 1970. The last bubble will be state indebtednesses, including in the so-called emerging economies, leading to bankruptcies.
Most people do not recognize — or refuse to recognize — these realities. It is wrenching to accept that the historical system in which we are living is in structural crisis and will not survive.
Meanwhile, the system proceeds by its accepted rules. We meet at G-20 sessions and seek a futile consensus. We speculate on the markets. We “develop” our economies in whatever way we can. All this activity simply accentuates the structural crisis. The real action, the struggle over what new system will be created, is elsewhere.
Immanuel Wallerstein is a senior research scholar at Yale University.
Jonas Bendiksen/Magnum Photos
SOVEREIGNTY IS FAR FROM DEAD
In his latest book, How Barack Obama Is Endangering Our National Sovereignty, John Bolton lays out what has become a consensus view on the American right. Those who argue that the United States must engage with international organizations to address global problems, he argues, are really saying we should “cede some of our sovereignty to institutions that other nations will also influence.” And that, warns this U.N.-bashing former Bush administration ambassador to the United Nations, “is unquestionably a formula for reducing U.S. autonomy and reducing our control over government.”
One wonders just where Ambassador Bolton has been for the past 362 years. Here’s the truth: The United States regularly contravenes the 17th-century view of countries as autonomous entities, free of outside interference, and instead works with other countries to bring opportunity and greater safety to Americans. Asserting independence remains a preoccupation of some U.S. politicians — not to mention authoritarian leaders around the world. But their brittle interpretation of sovereignty is an old-fashioned, and even dangerous, notion in world affairs.
Stephen D. Krasner, a former top State Department official, argues that from the very beginning, this absolutist definition of sovereignty, which dates back to the 1648 Peace of Westphalia, has been, at best, “organized hypocrisy.” After all, simply mailing a letter abroad requires countries to follow a common set of postal rules that no single one of them controls.
But globalization has increased the pace of sovereignty’s erosion. Today, the United States lets external actors affect its internal decisions all the time; there are simply too many benefits to be gained in return for agreeing to a common set of trade rules so that, for example, Americans can profit from exporting farm machinery and eat bananas year-round. To settle disputes that could flare into costly trade wars, the United States submits to arbitration under the World Trade Organization. The likelihood of a nuclear accident or terrorist incident has gone down thanks to the Nuclear Non-Proliferation Treaty, which requires the United States and the other 188 signatories to allow inspectors from the International Atomic Energy Agency at their nuclear facilities.
Nevertheless, Bolton and others continue to make hay of “permission slips,” pillory the international organizations that carry America’s water far more often than not, and warn of dire consequences from just about any treaty that requires U.S. compliance, such as the New START nuclear-arms deal between the United States and Russia or the Law of the Sea Treaty. They imply, absurdly, that a little sovereignty offered up here and there, and soon the French will be drafting U.S. zoning regulations.
The real problem is not that norms of sovereignty are changing in the United States, but that they are not changing fast enough elsewhere. China, for example, clings to a very traditional, absolutist view that the U.S. right might appreciate. While Beijing has been the No. 1 beneficiary of globalization’s international rules and treaties, it often demands at the same time that the world mind its own business. Awarding the Nobel Peace Prize to jailed dissident Liu Xiaobo, a Chinese Foreign Ministry official insisted, was “a violation of China’s judicial sovereignty.” A major sticking point in climate-change negotiations has been China’s refusal to allow international inspectors to verify reductions in its carbon dioxide emissions, and the head of its delegation at the 2009 Copenhagen climate-change summit explicitly invoked sovereignty to explain that stance. Chinese leaders have said the value of the yuan is a sovereign issue for Beijing — even though other countries, notably the United States, suffer from its artificially low level.
Despite the need for countries to be more flexible with their sovereignty, the nation-state is alive and well. Its authority to make decisions for the benefit of its people is unassailable. National governments retain the right to control their borders and govern as they wish — so long as they don’t commit mass atrocities. States are still the main actors in international affairs, albeit under guidelines that they do not fully control individually.
And that’s OK. The United States has to be the role model for a pragmatic, progressive view of sovereignty. If Americans cling to outmoded notions of national autonomy, they will be leading themselves, and the world, down a path of emboldened threats, stifled cooperation, and missed opportunities.
Nina Hachigian is a senior fellow at the Center for American Progress and co-author of The Next American Century: How the U.S. Can Thrive as Other Powers Rise.
Morton H. Halperin
DEMOCRACY IS STILL WORTH FIGHTING FOR
As the United States struggles to wind down two wars and recover from a humbling financial crisis, realism is enjoying a renaissance. Afghanistan and Iraq bear scant resemblance to the democracies we were promised. The Treasury is broke. And America has a president, Barack Obama, who once compared his foreign-policy philosophy to the realism of theologian Reinhold Niebuhr: “There’s serious evil in the world, and hardship and pain,” Obama said during his 2008 campaign. “And we should be humble and modest in our belief we can eliminate those things.”
But one can take such words of wisdom to the extreme — as realists like former Secretary of State Henry Kissinger and writer Robert Kaplan sometimes do, arguing that the United States can’t afford the risks inherent in supporting democracy and human rights around the world. Others, such as cultural historian Jacques Barzun, go even further, saying that America can’t export democracy at all, “because it is not an ideology but a wayward historical development.” Taken too far, such realist absolutism can be just as dangerous, and wrong, as neoconservative hubris.
For there is one thing the neocons get right: As I argue in The Democracy Advantage, democratic governments are more likely than autocratic regimes to engage in conduct that advances U.S. interests and avoids situations that pose a threat to peace and security. Democratic states are more likely to develop and to avoid famines and economic collapse. They are also less likely to become failed states or suffer a civil war. Democratic states are also more likely to cooperate in dealing with security issues, such as terrorism and proliferation of weapons of mass destruction.
As the bloody aftermath of the Iraq invasion painfully shows, democracy cannot be imposed from the outside by force or coercion. It must come from the people of a nation working to get on the path of democracy and then adopting the policies necessary to remain on that path. But we should be careful about overlearning the lessons of Iraq. In fact, the outside world can make an enormous difference in whether such efforts succeed. There are numerous examples — starting with Spain and Portugal and spreading to Eastern Europe, Latin America, and Asia — in which the struggle to establish democracy and advance human rights received critical support from multilateral bodies, including the United Nations, as well as from regional organizations, democratic governments, and private groups. It is very much in America’s interest to provide such assistance now to new democracies, such as Indonesia, Liberia, and Nepal, and to stand with those advocating democracy in countries such as Belarus, Burma, and China.
It will still be true that the United States will sometimes need to work with a nondemocratic regime to secure an immediate objective, such as use of a military base to support the U.S. mission in Afghanistan, or in the case of Russia, to sign an arms-control treaty. None of that, however, should come at the expense of speaking out in support of those struggling for their rights. Nor should we doubt that America would be more secure if they succeed.
Morton H. Halperin is senior advisor to the Open Society Institute and co-author of The Democracy Advantage.
SOMETIMES, THE CONVENTIONAL WISDOM IS RIGHT
The old battle between “conventional wisdom” and its debunkers isn’t what it used to be. When liberal economist John Kenneth Galbraith started using the term in the 1950s, his targets were not just any widely held wrong opinions, but those that were the product of inertia and convenience. Conventional wisdom, Galbraith thought, reinforced complacency. It enabled us to “avoid awkward effort or unwelcome dislocation of life,” he wrote in The Affluent Society.
Since then, of course, we’ve all become debunkers. (Just google the term “conventional wisdom watch” — virtually everyone is on alert for the stuff.) But something has happened to the mainstream ideas that need to be shaken up. Today’s conventional wisdom isn’t complacent. Its hair is on fire. It wants us to see that climate change threatens all humanity, that the United States is in decline, that we’re all going to have to work harder and longer in the future, that China is going to rule the world — unless perhaps a nuclear-armed Iran gets there first.
The debunking impulse has also changed. Far from trying to rouse us from slumber, its role is now to offer us a cup of warm milk before bed. If you doubt this, just dip into the debate about some important element of contemporary conventional wisdom, like the idea that America’s global dominance is eroding. This may seem so obvious as barely to require discussion. Yet some of our best, most independent-minded commentators on international politics, scholars and practitioners alike, are saying that your eyes deceive you: America’s decline is definitely not inevitable, and might not be happening at all.
These commentators acknowledge, of course, that preventing decline will oblige Americans to put their economic house in order and repair the damage done to their country’s image worldwide — all while preserving its global military edge and managing its alliances better than any president has done in the last 20 years. (And that’s just for starters!) But too often these little details are in the fine print. The heart of the debunking message — and what most readers carry away from it — is reassurance. America will be No. 1 for decades to come.
You see the same pattern in debates on other issues, and it doesn’t matter whether the left or the right holds the high ground. The earlier consensus on Afghanistan has begun to break up, for example. But it’s still probably conventional wisdom, at least in Washington, to favor continuing the war, given the damage that retreat would do to U.S. interests across the Middle East and South Asia. The debunkers, by contrast, reassure us that withdrawal won’t be so bad. If, as they now seem to have decided, defeat in Vietnam was no big deal, why should Afghanistan be any different?
Conventional wisdom is rarely good at explaining itself, least of all when its message is that the United States faces one big challenge after another and that they can’t be successfully addressed without “unwelcome dislocation of life.” The debunkers may well be right that America can avoid decline, but only by dint of gigantic effort. They might also be right that President Barack Obama or his successor can find ways to limit the damage of withdrawal from Afghanistan, but doing so will surely require more resources, focus, and commitment than the debunkers foresee. And that holds for other issues, whether it’s coping with a nuclear Iran, the economic crisis, or climate change.
The most dangerous idea Americans face these days is that they can do less (or do nothing) and still get by. On this question, the conventional wisdom — in all its hair-on-fire banality — is absolutely right.
Stephen Sestanovich is a professor at Columbia University’s School of International and Public Affairs and a senior fellow at the Council on Foreign Relations.