A report card on the American Project

Thirty years later, perhaps it’s time to assess just how well the United States has fulfilled the expectations President Bush articulated in 1990.

Source: TomDispatch

Thirty years ago this month, President George H.W. Bush appeared before a joint session of Congress to deliver his first State of the Union Address, the first post-Cold War observance of this annual ritual. Just weeks before, the Berlin Wall had fallen. That event, the president declared, “marks the beginning of a new era in the world’s affairs.” The Cold War, that “long twilight struggle” (as President John F. Kennedy so famously described it), had just come to an abrupt end. A new day was dawning. President Bush seized the opportunity to explain just what that dawning signified.

“There are singular moments in history, dates that divide all that goes before from all that comes after,” the president said. The end of World War II had been just such a moment. In the decades that followed, 1945 provided “the common frame of reference, the compass points of the postwar era we’ve relied upon to understand ourselves.” Yet the hopeful developments of the year just concluded—Bush referred to them collectively as “the Revolution of ’89”—had initiated “a new era in the world’s affairs.”

While many things were certain to change, the president felt sure that one element of continuity would persist: the United States would determine history’s onward course. “America, not just the nation but an idea,” he emphasized, was and would remain “alive in the minds of people everywhere.”

“As this new world takes shape, America stands at the center of a widening circle of freedom—today, tomorrow, and into the next century. Our nation is the enduring dream of every immigrant who ever set foot on these shores and the millions still struggling to be free. This nation, this idea called America, was and always will be a new world—our new world.”

Bush had never shown himself to be a particularly original or imaginative thinker. Even so, during a long career in public service, he had at least mastered the art of packaging sentiments deemed appropriate for just about any occasion. The imagery he employed in this instance—America occupying the center of freedom’s widening circle—did not stake out a new claim devised for fresh circumstances. The notion that history centered on what Americans professed or did expressed a hallowed proposition, one with which his listeners were both familiar and comfortable. Indeed, Bush’s description of America as a perpetually self-renewing enterprise engaged in perfecting freedom summarized the essence of the nation’s self-assigned purpose.

In his remarks to Congress, the president was asserting a prerogative that his predecessors had long ago appropriated: interpreting the zeitgeist in such a way as to merge past, present, and future into a seamless, self-congratulatory, and reassuring narrative of American power. He was describing history precisely as Americans—or at least privileged Americans—wished to see it. He was, in other words, speaking a language in which he was fluent: the idiom of the ruling class. 

As the year 1990 began, duty—destiny, even—was summoning members of that ruling class to lead not just this country but the planet itself, and not just for a decade or two, or even for an “era,” but forever and a day. In January 1990, the way ahead for the last superpower on planet Earth—the Soviet Union would officially implode in 1991, but its fate already seemed obvious enough—was clear indeed.

So, how’d we do?

Thirty years later, perhaps it’s time to assess just how well the United States has fulfilled the expectations President Bush articulated in 1990. Personally, I would rate the results somewhere between deeply disappointing and flat-out abysmal. 

Bush’s “circle of freedom” invoked a planet divided between the free and the unfree. During the Cold War, this distinction had proven useful even if it was never particularly accurate. Today, it retains no value whatsoever as a description of the actually existing world, even though in Washington it persists, as does the conviction that the U.S. has a unique responsibility to expand that circle.

Encouraged by ambitious politicians and ideologically driven commentators, many (though not all) Americans bought into a militarized, Manichean, vastly oversimplified conception of the Cold War. Having misconstrued its meaning, they misconstrued the implications of its passing, leaving them ill-prepared to see through the claptrap in President Bush’s 1990 State of the Union Address. 

Bush depicted the “Revolution of ’89” as a transformative moment in world history. In fact, the legacy of that moment has proven far more modest than he imagined. As a turning point in the history of the modern world, the end of the Cold War ranks slightly above the invention of the machine gun (1884), but well below the fall of Russia’s Romanov dynasty (1917) or the discovery of penicillin (1928). Among the factors shaping the world in which we now live, the outcome of the Cold War barely registers.

Fairness obliges me to acknowledge two exceptions to that broad claim, one pertaining to Europe and the other to the United States. 

First, the end of the Cold War led almost immediately to a Europe made “whole and free” thanks to the collapse of the Soviet empire. Yet while Poles, Lithuanians, the former citizens of the German Democratic Republic, and other Eastern Europeans are certainly better off today than they were under the Kremlin’s boot, Europe itself plays a significantly diminished role in world affairs. In healing its divisions, it shrank, losing political clout. Meanwhile, in very short order, new cleavages erupted in the Balkans, Spain, and even the United Kingdom, with the emergence of a populist right calling into question Europe’s assumed commitment to multicultural liberalism.

In many respects, the Cold War began as an argument over who would determine Europe’s destiny. In 1989, our side won that argument. Yet, by then, the payoff to which the United States laid claim had largely been depleted. Europe’s traditional great powers were no longer especially great. After several centuries in which global politics had centered on that continent, Europe had suddenly slipped to the periphery. In practice, “whole and free” turned out to mean “preoccupied and anemic,” with Europeans now engaging in their own acts of folly. Three decades after the “Revolution of ’89,” Europe remains an attractive tourist destination. Yet, from a geopolitical perspective, the action has long since moved elsewhere.

The second exception to the Cold War’s less than momentous results relates to U.S. attitudes toward military power. For the first time in its history, the onset of the Cold War had prompted the United States to create and maintain a powerful peacetime military establishment. The principal mission of that military was to defend, deter, and contain. While it would fight bitter wars in Korea and Vietnam, its advertised aim was to avert armed conflicts or, at least, keep them from getting out of hand. In that spirit, the billboard at the entrance to the headquarters of the Strategic Air Command, the Pentagon’s principal Cold War nuclear strike force (which possessed the means to extinguish humankind), reassuringly announced that “peace is our profession.”

When the Cold War ended, however, despite the absence of any real threats to U.S. security, Washington policymakers decided to maintain the mightiest armed forces on the planet in perpetuity. Negligible debate preceded this decision, which even today remains minimally controversial. That the United States should retain military capabilities far greater than those of any other nation, or even of any plausible combination of other nations, seemed eminently sensible.

In appearance and configuration, the post-Cold War military differed little from the force maintained between the 1950s and 1989. Yet the armed forces of the United States now took on a radically different, far more ambitious mission: to impose order and spread American values globally, while eliminating obstacles deemed to impede those efforts. During the Cold War, policymakers had placed a premium on holding U.S. forces in readiness. Now, the idea was to put “the troops” to work. Power projection became the name of the game.

Just a month prior to his State of the Union Address, President Bush himself had given this approach a test run, ordering U.S. forces to intervene in Panama, overthrow the existing government there, and install in its place one expected to be more compliant. The president now neatly summarized the outcome of that action in three crisp sentences. “One year ago,” he announced, “the people of Panama lived in fear, under the thumb of a dictator. Today democracy is restored; Panama is free. Operation Just Cause has achieved its objective.” 

Mission accomplished: end of story. Here, it seemed, was a template for further application globally.

As it happened, however, Operation Just Cause proved to be the exception rather than the rule. Intervention in Panama did inaugurate a period of unprecedented American military activism. In the years that followed, U.S. forces invaded, occupied, bombed, or raided an astonishing array of countries. Rarely, however, was the outcome as tidy as it had been in Panama, where the fighting lasted a mere five days. Untidy and protracted conflicts proved more typical of the post-Cold War U.S. experience, with the Afghanistan War, a futile undertaking now in its 19th year, a notable example. The present-day U.S. military qualifies by any measure as highly professional, much more so than its Cold War predecessor. Yet the purpose of today’s professionals is not to preserve peace but to fight unending wars in distant places.

Intoxicated by a post-Cold War belief in its own omnipotence, the United States allowed itself to be drawn into a long series of armed conflicts, almost all of them yielding unintended consequences and imposing greater-than-anticipated costs. Since the end of the Cold War, U.S. forces have destroyed many targets and killed many people. Only rarely, however, have they succeeded in accomplishing their assigned political purposes. From a military perspective—except perhaps in the eyes of the military-industrial complex—the legacy of the “Revolution of ’89” turned out to be almost entirely negative.

A broken compass

So, contrary to President Bush’s prediction, the fall of the Berlin Wall did not inaugurate a “new era in world affairs” governed by “this idea called America.” It did, however, accelerate Europe’s drift toward geopolitical insignificance and induced in Washington a sharp turn toward reckless militarism—neither of which qualifies as cause for celebration. 

Yet today, 30 years after Bush’s 1990 State of the Union, a “new era in world affairs” is indeed upon us, even if it bears scant resemblance to the order Bush expected to emerge. If his “idea called America” did not shape the contours of this new age, then what has?

Answer: all the things post-Cold War Washington policy elites misunderstood or relegated to the status of afterthought. Here are three examples of key factors that actually shaped the present era. Notably, each had its point of origin prior to the end of the Cold War. Each came to maturity while U.S. policymakers, hypnotized by the “Revolution of ’89,” were busily trying to reap the benefits they fancied to be this country’s for the taking. Each far surpasses in significance the fall of the Berlin Wall.

The “Rise” of China: The China that we know today emerged from reforms instituted by Communist Party leader Deng Xiaoping, which transformed the People’s Republic into an economic powerhouse. No nation in history, including the United States, has ever come close to matching China’s spectacular ascent. In less than four decades, its per capita gross domestic product skyrocketed from $156 in 1978 to $9,771 in 2017.

The post-Cold War assumption common among American elites that economic development would necessarily prompt political liberalization turned out to be wishful thinking. In Beijing today, the Communist Party remains firmly in control. Meanwhile, as illustrated by its “Belt and Road” initiative, China has begun to assert itself globally, while simultaneously enhancing the capabilities of the People’s Liberation Army. In all of this, the United States—apart from borrowing from China to pay for an abundance of its imported products (now well over a half-trillion dollars of them annually)—has figured as little more than a bystander. As China radically alters the balance of power in twenty-first-century East Asia, the outcome of the Cold War has no more relevance than does Napoleon’s late-eighteenth-century expedition to Egypt. 

A Resurgence of Religious Extremism: Like the poor, religious fanatics will always be with us. They come in all stripes: Christians, Hindus, Jews, Muslims. Yet implicit in the American idea that lay at the heart of Bush’s State of the Union Address was an expectation that modernity would remove religion from politics. That the global advance of secularization would lead to the privatization of faith was accepted as a given in elite circles. After all, the end of the Cold War ostensibly left little to fight about. With the collapse of communism and the triumph of democratic capitalism, all the really big questions had been settled. That religiously inspired political violence would become a crucial factor in global politics therefore seemed inconceivable.

Yet a full decade before the “Revolution of ’89,” events were already shredding that expectation. In November 1979, radical Islamists shocked the House of Saud by seizing the Grand Mosque in Mecca. Although local security forces regained control after a bloody gun battle, the Saudi royal family resolved to prevent any recurrence of such a disaster by demonstrating beyond the shadow of a doubt its own fealty to the teachings of Allah. It did so by expending staggering sums throughout the Ummah to promote a puritanical form of Islam known as Wahhabism.

In effect, Saudi Arabia became the principal underwriter of what would morph into Islamist terror. For Osama bin Laden and his militant followers, the American idea to which President Bush paid tribute that January in 1990 was blasphemous, intolerable, and a justification for war. Lulled by a belief that the end of the Cold War had yielded a definitive victory, the entire U.S. national security apparatus would be caught unawares in September 2001 when religious warriors assaulted New York and Washington. Nor was the political establishment prepared for the appearance of violence perpetrated by domestic religious extremists. During the Cold War, it had become fashionable to declare God dead. That verdict turned out to be premature.

The Assault on Nature: From its inception, the American idea so lavishly praised by President Bush in 1990 had allowed, even fostered, the exploitation of the natural world based on a belief in Planet Earth’s infinite capacity to absorb punishment. During the Cold War, critics like Rachel Carson, author of the pioneering environmental book Silent Spring, had warned against just such an assumption. While their warnings received respectful hearings, they elicited only modest corrective action. 

Then, in 1988, a year prior to the fall of the Berlin Wall, in testimony before Congress, NASA scientist James Hansen issued a far more alarming warning: human activity, particularly the burning of fossil fuels, was inducing profound changes in the global climate with potentially catastrophic consequences. (Of course, a prestigious scientific advisory committee had offered just such a warning to President Lyndon Johnson more than two decades earlier, predicting the early twenty-first-century effects of climate change, to no effect whatsoever.)

To put it mildly, President Bush and other members of the political establishment did not welcome Hansen’s analysis. After all, to take him seriously meant admitting to the necessity of modifying a way of life centered on self-indulgence rather than self-restraint. At some level, perpetuating the American penchant for material consumption and personal mobility had defined the ultimate purpose of the Cold War. Bush could no more tell Americans to settle for less than he could imagine a world order in which the United States no longer occupied “the center of a widening circle of freedom.”

Some things were sacrosanct. As he put it on another occasion, “The American way of life is not up for negotiations. Period.”

So while President Bush was not an outright climate-change denier, he temporized. Talk took precedence over action. He thereby set a pattern to which his successors would adhere, at least until the Trump years. To thwart communism during the Cold War, Americans might have been willing to “pay any price, bear any burden.” Not so when it came to climate change. The Cold War itself had seemingly exhausted the nation’s capacity for collective sacrifice. So, on several fronts, the assault on nature continues and is even gaining momentum.

In sum, from our present vantage point, it becomes apparent that the “Revolution of ’89” did not initiate a new era of history. At most, the events of that year fostered various unhelpful illusions that impeded our capacity to recognize and respond to the forces of change that actually matter.

Restoring the American compass to working order won’t occur until we recognize those illusions for what they are. Step one might be to revise what “this idea called America” truly signifies.
