Soldiers participating in Operation Enduring Freedom in 2002 outside of Shinkay, Afghanistan (SSG Leopold Medina Jr./Wikimedia Commons)

At the dawn of a new century, Commonweal editors welcomed a respite from “all the yammering” about the millennium. “From the interminable impeachment of William Jefferson Clinton,” they wrote in January 2000, “to the false apocalypse of Y2K…and the promise of a Donald Trump presidential bid, 1999 threatened not only to usher in a new century but to last a century as well.” 


Eager to focus on more pressing concerns—including third-world debt relief, criminal-justice reform, and the Israel-Palestine conflict—editors gratefully bid farewell to the millennial hoopla and re-focused their attention on the similarly hyped “Bridge to the 21st Century,” warning that “no one knows what the future holds, except that it holds a good deal of the past.”


A period of American triumphalism followed the collapse of the Soviet Union in 1991 and lasted throughout the Clinton presidency. As the world’s lone remaining superpower, the United States repeatedly asserted its global hegemony by trying to spread democratic values and free-market capitalism to virtually every country in the world. The “peace and prosperity” that followed was heralded as both a sign of globalization’s transformative impact and liberal democracy’s historical inevitability—or, as Francis Fukuyama famously phrased it, “the end point of mankind’s ideological evolution.”


In reality, globalization had already started to exacerbate economic inequality in the United States, and the U.S. military had to intervene in Somalia, East Timor, Bosnia, Iraq, and other countries to keep an increasingly precarious peace and maintain America’s global dominance. Meanwhile, in Rwanda, Sierra Leone, and other war-ravaged countries, atrocities continued without a firm response from Washington.


As the United States increasingly exerted its power, it became evident that many of the problems of the previous century would continue into this one, and that the United States would still bear a great deal of the responsibility for them. Commonweal did not hesitate to challenge the U.S. government’s increasing unilateralism, or to draw attention to the political and cultural disruptions it produced. In the aftermath of the September 11 terrorist attacks, Commonweal contributors called out the excesses and mystifications of George W. Bush’s “global war on terror,” the effects of which are still being felt today. 


In the years that followed, the magazine featured articles and essays that tried to make sense of a much more complicated and troubled world than the one most Americans had expected at the beginning of the decade. Commonweal’s editors and contributors examined the wars in Iraq and Afghanistan, the torture of prisoners at Abu Ghraib and in CIA “black sites,” and the indefinite imprisonment of “enemy combatants” in Guantánamo Bay—all of which undermined America’s moral authority and the logic of its foreign policy.


Here, on the occasion of our centennial, we present “American Destiny,” a 2002 essay by former Commonweal editor and longtime contributor William Pfaff, once described by Arthur M. Schlesinger Jr. as Walter Lippmann’s authentic heir because he “place[d] the rush of events in historical and cultural perspective and [wrote] about them with lucidity and grace.”

Contrary to what usually is claimed in Washington, American foreign policy has rarely been directed by calculations of national interest. That policy is shaped by nationalism, fear, consideration of commercial interests, political ideology, or prejudice, but—for better or worse—the dominant force has always been a vision of national destiny, into which all the rest is subsumed. This sense of destiny has to be understood in order to acquire a sense of where the country is now headed. The victory in Afghanistan, following the Bush administration’s declaration of war on terrorism in September, provides a tangible realization of ideas that have a long history in the United States. 

American foreign policy has always rested on the belief that modernization, Westernization, and Americanization are integrally related and unalloyed benefits, necessary factors in the establishment of good order in human society. In contrast, terrorism—violence against civilians in a political cause—is understood as an expression of disorder. 

From the beginning, the American nation has operated on the conviction that it is destined to lead the way for humanity. This has been fundamental to the American conception of the nation’s historical role ever since 1629, when persecuted Puritan dissenters from the Church of England assumed control of the Massachusetts Bay Company and set out to establish on virgin land a community formed in a new religious dispensation, meant to provide a model for humanity. From this moral and intellectual foundation, the logical conclusion Americans have drawn has been that the world is eventually destined to become integrated into an elaboration of the American system, whose superiority is additionally demonstrated today by its economic strength and productivity, and its technological dynamism and unmatched record of innovation.

The superiority of American political values and standards is taken to be self-evident. The triumph of the American model and the fall of its social and political rival, the Soviet Union, are understood as demonstrations of the inevitable emergence of a new form of international society guided if not ultimately governed by the United States. 

The debate in the United States over extending the so-called war against terrorism to countries other than Afghanistan ended with the Bush administration’s decision to intervene in the Philippines. If we are to believe the president’s State of the Union declarations concerning what he described as the “axis of evil,” Iraq, Iran, and North Korea (formerly described as rogue nations) together with what remains of the Al Qaeda camps and membership, more is to come. Once again a long struggle is promised, and an unending national mobilization demanded. 

It remains unclear, however, exactly what this means. It is possible that this course, whatever it does prove to mean in practical terms, could ultimately have consequences unacceptable to the American public, and there could be eventual recoil. Frustration or defeat in Vietnam, Lebanon, and Somalia caused past American retreats from foreign involvements. (However, it is significant that those episodes failed to produce a permanent policy redirection; since World War II, interventionism has remained the dominant tendency.) 

Global economic intervention, the promotion of a “globalized” international economy, has been Washington’s policy since the 1980s, but simultaneously there has been a less well-publicized globalization of the American military presence. Since the end of the cold war the United States has steadily built up an international military infrastructure of regional commands, base agreements, and relationships with foreign military forces that assumes a permanent and global American military presence. U.S. forces are currently deployed in some forty countries, and there are intimate command and staff relationships with exchanges and training missions in nearly all of Europe and the former Soviet states, and in much of Asia and parts of Latin America. 

Before the Afghanistan war was even over, construction had begun on American installations in Uzbekistan and Pakistan. These were meant to support an American presence in Central Asia which, according to the Pentagon at the time, “could last for years.” The State Department declared that it had found “a new commitment—a recommitment—of the government of Uzbekistan” to democracy, and U.S. aid to that country in 2002 will be triple that of last year. (After Russia indicated its hostility to the extension of the American military base system to this former Soviet republic, the United States denied that the bases were meant to be permanent.) 

The current strategic doctrine of the Pentagon is to keep the battlefield as far as possible from the United States, restoring that defensive distance so dramatically lost on September 11, and achieving everywhere in the world, in the Pentagon’s phrase, “full-spectrum dominance.” 


 

It is nonetheless important to note that the United States’ underlying or “normal” and historical relationship with the external world has been isolationist and morally isolated, in contrast to today’s energetic commitment to intervention. However, the national separatism of the nineteenth century was itself an expression of national mission, being intended to protect the country from contamination by the old order of “power politics.” The internalized American consciousness responsible for the country’s historical isolationism was formed by the experience of geographical as well as political isolation, the latter fundamentally linked to the nature of the American political experiment, itself meant to break with European history and create a new and redemptive political association of free men and women, untainted by the Old World’s history. 

Though two world wars were necessary to force the United States out of that isolationism, Americans had already shown dissatisfaction with the policy before the end of the nineteenth century. The war with Spain in 1898, entered into more for moralistic and ideological than political reasons (accompanied by no little religious prejudice against Catholic Spain), marked out an imperialist and colonialist ambition, in the fashion of the period. The United States acquired control of the Philippines, Guam, and Puerto Rico (plus Hawaii, which had nothing to do with Spain), while exercising effective control of a nominally independent Cuba (until 1934, the Platt Amendment gave Washington a right to unilateral intervention). Colonialism proved controversial, however, and while the United States decided in 1934 that the Philippines should be given eventual independence, American military bases were not withdrawn until the 1990s. A new military relationship has now been created there in the antiterrorist cause.

The isolationist impulse was reawakened by World War I, which the United States entered only after Woodrow Wilson had identified intervention as the war to end war. After the war and the death of Wilson, a disillusioned Congress refused to make a permanent commitment to Western Europe’s security. The fear of being corrupted by a European engagement persisted. There was no great enthusiasm for intervention in World War II until Japan bombed Pearl Harbor, providing the casus belli in the Pacific, and Hitler, honoring his obligation to Japan, drew the United States into the war in Europe. A precipitous military demobilization took place just after World War II, halted only when threatening Soviet conduct, followed by the 1948 Communist coup in Czechoslovakia and the attack in Korea, confirmed the arrival of the cold war. The liberal internationalism which followed lasted until the Soviet collapse, although increasingly affected by American exceptionalism and unilateralism, indirect expressions of the isolationist temperament. The latter qualities have now prevailed in the administration of George W. Bush. 

The new president took office declaring the cold war finished and insisting that American policy would henceforth be governed by a strict conception of national interest. That seemed reversed in the immediate aftermath of September 11, but it rapidly became apparent that, while the United States would turn to alliances to buttress its political position, its intention was to act entirely on its own. The war against terrorism has been unilateralist in conception and execution, and Washington’s principal allies have become increasingly troubled by the conduct and extension of that war. 

 

From 1990 and the collapse of the Soviet Union until this past September, the most important element in the American relationship with other nations was economic and commercial policy. General deregulation of the world economy, pursuit of open international investment and trade regimes, and the promotion of American-style corporate management and business practices were seen as the way to promote international progress towards what Francis Fukuyama called the “end of history.” The main emphasis in foreign policy became deregulation and trade liberalization. The end of the cold war had shifted the country’s national perception of interest from the military and political spheres to economic and commercial ones, to which the first Bush and the Clinton administrations responded, both of them dominated by business interests. 

During the 1990s globalization became a generalized phenomenon. Originally, it had been a technology-driven integration of societies and economies, expressing forces that were politically, economically—and morally—neutral. In essential respects, the process hardly differed from the technological transfers of the past, which sovereign political societies and economies accommodated with generally positive effect, and without being subverted.

What happened in the 1990s was that the United States mobilized its immense political as well as economic power to deregulate the international economy, opening foreign economies and industries to American investment or ownership, making foreign markets, raw materials, and labor available to American business, so as to expand an international free-trade economy operating under essentially American norms and responsive to American interests. Globalization thus began to resemble colonialism (which itself was originally conceived as a force for progress). The goal of globalism is Utopian (as John Gray of the London School of Economics has argued), in that a universal economy, freed from government regulation, self-governed by the supposedly impartial and uniquely efficient mechanisms of the market, is held to generate the greatest overall wealth, measured as productivity and gross national product, and therefore, in principle, to produce the greatest material well-being for the greatest number of people. 

Such a result, even in theory, could only be achieved at the expense of diversity and pluralism in the economic sphere, undermining or destroying both in the course of the drive for productivity and maximum stockholder return (the postulated measure of efficiency) in not only manufacturing but all that vast zone of human culture directly connected with economic activity, including not only popular entertainment, broadcasting, publishing, and journalism, but the arts—and perhaps most significant, if indirectly, politics. 

By its nature, globalization is disruptive, in that it is indifferent or hostile to the historical world and its inherent constraints, and to idiosyncratic custom and culture—to the world of the olive tree in the reductive term of New York Times columnist Thomas Friedman. As the writer David Rieff has said (in World Policy Journal) of the Utopian conception of globalization, “purportedly hard-headed and optimistic..., it is, however unconsciously, callous, ignorant, complacent, nationalistic and contemptuous of other cultures and other philosophical traditions.” Gray describes it as a legitimate successor to that other secular Utopian project, the Marxist version of dialectical materialism. It rests intellectually upon an impregnable parochialism and a bias against the past. 

 

The utopianism of globalization has been essential in its appeal to Americans. Even among Democrats and liberals, there was remarkably little controversy over the Clinton administration’s adoption of a policy of international economic deregulation at the very beginning of its first term. Globalization was recognized as an expression of economic values and practices that had become taken for granted in American life at least since the Reagan years, in striking contrast to the values of previous Democratic and liberal Republican administrations (including that of Richard Nixon, who in domestic social policy was a progressive). 

The new ideas held that augmenting the stockholder value of business corporations should be the ruling objective in the conduct of business, with the claims of the public, and of the other actors in the corporate equation—employees and community “stakeholders”—subordinated to those of the stockholders. Under this business doctrine, which has prevailed in the business schools of the United States since the 1970s (and might be said to have achieved an apocalyptic reductio ad absurdum in the Enron corporation), Western economies have on average grown very rapidly. Critics have observed, of course, that this growth has mainly benefited the already rich countries, where neoliberal ideology has dominated social choices and social and employment policy at the price of wage stagnation, the need for multiple wage-earning households, and actual destitution on the one hand combined with enrichment of economic elites on the other. 


The influence of this doctrine on the non-Western societies drawn into the globalized economy has been to reproduce or enlarge social and economic divisions, and to substitute internationalized business and employment norms and production for export markets for traditional social choices, artisanal enterprises, and self-sufficiency farming. The neoliberal argument contends, of course, that all this follows from market dictates, and that the harm it does in the short run will eventually produce benefit for all—the moral (or immoral) rationalization of all past Utopian projects. 

 

With the 1975 defeat in Vietnam, American foreign policy entered a new epoch, which September 11, 2001 would seem to have ended. After Vietnam, military interventions were considered a threat to domestic political stability and to the good order of American armies; hence the doctrine that emerged, eventually called the Powell Doctrine, held that the only feasible interventions were those in which the United States deployed overwhelming force and had a convincing exit strategy. The military record, nonetheless, did not, on the whole, prove a positive one. 

The Balkan interventions offered the best ratio of positive to negative consequences, but were reluctantly undertaken and their outcomes remain fragile even today. No one would argue that the 1995 Dayton Accords for Bosnia, the improvised international trusteeship that now governs Kosovo, whose permanent status remains unresolved, and the fragile coalition of political forces recently achieved in Macedonia, are enduring solutions. Elsewhere, in Lebanon, Panama, Nicaragua, and Somalia, as well as during the Gulf War and the quasi-war that followed against Iraq, American military interventions, whether overt or clandestine, usually left behind worse long-term conditions than had existed before. The consequences were and are often damaging to the United States itself. 

The sources of Islamic terrorism lie partially in the Israeli repression of the Palestinians, which the United States has indirectly supported. They also include the interventionist roles assumed by the United States in Iran under the Shah, and in Saudi Arabia, an alliance with another fragile and repressive monarchy. In Iran and Saudi Arabia, policy rested on the belief that intimate involvement of the United States in the decisions of these governments would be a liberalizing political force. It also mistakenly assumed that promoting a major military role for Iran as a regional American auxiliary, and building a permanent American base structure in the Persian Gulf and inside Saudi Arabia itself, would promote and defend Western values rather than undermine them. 

The crisis in Afghanistan began with an ostensibly Communist coup d’état in 1978, led by a member of the traditional Afghan elite. The Soviet military intervention that followed in support of this new regime brought on an expedient Washington policy of support for international Islamic militancy, until then a negligible political force. We know the consequences for the Soviet Union, for Afghanistan, Pakistan, and the United States, and potentially for Saudi Arabia and other Islamic states in the future. 

Before September 11, nearly all Americans took for granted that the nation’s involvement in global affairs had a constructive overall effect. A division nonetheless had opened in recent years in the policy community, as figures on both the left and the right of the party divide formulated a new version of the old national messianism. They proposed a deliberate program to exploit the nation’s post-1990 supreme power in order to establish a new version of that benevolent world order first proposed to Americans by Woodrow Wilson, who said that God had created the United States “to show the way to the nations of the world how they shall walk in the paths of liberty.” This “New Wilsonianism” was held to be not only a defense of the United States against the alleged menace of rogue states and terrorism, or of a resurgent China or Russia (the assumed threats of a decade ago), but also the creation of a benevolent global hegemony in which Americans could find lasting security. Other nations, it was said, would accept it because they know that they have “little to fear or distrust from a righteous [America]” (as one advocate of the idea, Joshua Muravchik, ingenuously wrote in 1996). President Bush has more recently said, “We know how good we are!” 

A minority in the American policy community has opposed this hegemonic program, seeing in it that folly of grandeur, and the same self-intoxicated national messianism, about which the past offers eloquent and cautionary lessons. At the same time, the American electorate until recently had also seemed, on the whole, unreceptive to the idea that the United States should attempt to impose its leadership on international society. That majority seems now to have been overruled. 

 

Contemporary history is generally assumed by American policymakers to be a natural if troubled progression from international disorder toward enlightened order, hence fundamentally congenial to the United States as the embodiment of progressive forces. Historical pessimism—a decidedly minority position among Americans—would argue the contrary, that current American policies and interventions, meant to replace the governing regimes of so-called rogue states, or “failed states,” with friendly governments, and to tighten American alliances and reinforce U.S. authority inside the alliances, extending American military power through systems of regional commands and close association with the military forces of client countries, have tended to create disorder rather than order. 


The United States, simply by being the sole superpower and the most powerful national economy, undermines the established order, which is not a system of unity, and which naturally resists hegemonic power, as has consistently been the case historically. An imbalance of international power has “usually provoked wars,” as the Israeli statesman Abba Eban has observed, and “has never consolidated peace.” Yet these American interventions that generate instability and conflict continue to be carried out in the belief that they actually promote stability, progress, democracy, growth and development, and humane social standards: hence when they have negative consequences, efforts must be redoubled. The destiny of other nations, it is held, eventually will converge with American destiny. 

International society now is confronted with a paradox. Its most powerful member, the United States, conceiving of itself as the model of modern civilization, responsible for international order and progress, practices economic and military-political policies that are inherently or even deliberately destructive of central elements in the existing apparatus of international law and arms control, and the existing norms of international cooperation and order, which it condemns as largely outmoded, if not hostile to American national interests. It does so with mounting emphasis on military solutions and diminishing attention to international precedent and opinion. Even among its allies, this stance has provoked uneasiness, even fear of the unpredictability of American actions, and of their ruthlessness (demonstrated currently by the exclusion of foreign prisoners held in the United States for immigration irregularities from the common law protection afforded by writs of habeas corpus; the apparent determination to incarcerate permanently some uncharged prisoners outside the territorial United States—or until the war against terrorism “is over”; and the illegal transfer of still others to national jurisdictions where torture is tolerated). 

The paradox is unlikely to be resolved without an eventual crisis in America’s relationship with international society. Such a crisis would necessarily throw into question the nation’s own understanding of the meaning of the American national experiment, with unforeseeable consequences. 

 

Albert Camus wrote in 1958, concerning France’s war in his native Algeria, that the role of the intellectual is “to clarify definitions in order to disintoxicate minds and to calm fanaticisms.” The United States is not at war with “evil,” a moral or metaphysical reality. It is at war with a limited and self-motivated group of individuals, possessing limited resources, who employ terrorism against the United States for mixed political and religious reasons. Their principal religious motivation is the conviction that the United States is responsible for an assault on the values of their society, globally propagating a systematic materialism and a nihilistic and narcissistic hedonism: an accusation that might equally be made by readers of this magazine. 

The enemy consists of such individuals, and of several weak states under dictatorial or oligarchic rule, pursuing nationalistic, ideological, or religious agendas hostile to the United States. They are conceivably capable of manufacturing or acquiring weapons of mass destruction in order to deter American attack against them. In the American strategic community it is taken for granted that such weapons have no offensive utility against the United States. While individuals may become inspired by religious or ideological fanaticisms to commit suicide in order to harm America (or Israel, America’s ally), governments are collectively ruled and administered by self-interested persons. Even the seeming nihilism of Hitler had a plan behind it. 

The first obligation on the citizen is to disintoxicate the American debate. The second is to reduce the unprecedented influence of the military in Washington, not because military thinking is intrinsically objectionable—it is not—but because the Pentagon is now the most important bureaucratic actor in Washington, and in combination with the defense and aerospace industries, Washington’s most important lobby. It exercises overwhelming influence on administrations and Congress to look for military solutions to nonmilitary problems. This is very dangerous to the United States itself, as well as to international society. 

The third obligation is to rescue American government from money, whose influence has transformed American democracy (or the American representative republic, to be exact) into a grossly unrepresentative plutocracy. The role of money in American government has always been very large, but since advertising on commercial television became the dominant medium of American political communication, and since the 1976 Supreme Court ruling (Buckley v. Valeo) that held that spending money on political advertising is a constitutionally protected form of free speech, a means test has been imposed on political candidates. Corporate interest has become by far the most important influence on foreign and domestic policy, all but eliminating the influence, or even the widespread articulation, of the concept of general or public interest in our national affairs. The effect of the recently signed campaign-finance legislation remains to be seen, as does whether even these strictures will be upheld in court. 

The final obligation is to comprehend that American nationalism, wedded to American messianism, has currently acquired overpowering force in American life, in that it drives a program of total military domination everywhere, among allies and neutrals as well as enemies, and a political program of suppressing any resistance to perceived American interest in any matter at all, whatever the cost to allied interests, international community, or international law or precedent. Behind this seems to lie what I would describe as an unarticulated, unintended, yet culpable denial that any sovereign interest exists, beyond American interest—which is an implicit blasphemy. 

William Pfaff, a former editor of Commonweal, was a political columnist for the International Herald Tribune in Paris and author of The Irony of Manifest Destiny: The Tragedy of America's Foreign Policy (Walker & Company).

Published in the September 2024 issue.


© 2024 Commonweal Magazine. All rights reserved.