In the second year of the U.S. occupation of Iraq many people in the U.S. still cling to a political tradition that confuses actually existing American society “with the ideal society that would fulfill human destiny.”1 They tend to think of the United States not as the polyarchy and global empire that it is, but as the incarnation of “freedom and democracy,” or at least the closest approximation to the democratic ideal that exists. Whatever their assessment of current U.S. foreign policy, they regard their country as the Promised Land, the embodiment of Western virtue, the deliverer of freedom to oppressed peoples.
Many see it, too, as the only national state that wages perpetual war for the global good. From starting a war to setting aside the prohibitions of international law and morality, the U.S. is entitled to do, beyond its borders, what it wants when it wants, provided the action can be justified in utilitarian terms of saving American lives and the U.S. Congress goes along with it.2
Whether we call this absolute veneration of “America” national essentialism or millennialism, whether we see it as the outlook of a superpower or the prerogative of a self-designated Chosen People, at its root lies “the belief that [American] history, under divine guidance, will bring about the triumph of Christian principles” and eventually the emergence of “a holy utopia.”3 Such faith in the unique moral destiny of the United States may be held independently of Christian beliefs. Its historical origins, however, trace back to colonial New England, and beyond that to the Bible; and it pervades every part of the country, even though its strongest regional base presently lies in the South and West.
Since long before the birth of the Republic, ideas of chosenness have been at the heart of a complicated ideology of rule that resonates powerfully in American society.4 Both the Puritan Calvinists of the Massachusetts Bay Colony and the Protestant millenarians of the early 19th century conceived of the United States as an exceptional nation, chosen by God to be the acme of freedom and to redeem humankind. As historian Ernest Tuveson observed during the Vietnam War era, the idea of the “redeemer nation” through which God operates is also the foundation of the notion of continuous warfare between ‘good’ and ‘evil’ people.5 Virtually every politician who exploits the religious emotions of people in the U.S. for the purpose of waging war draws on these ideas and images, embodied in religious and secular texts.
Today no single millenarian ideology exists, but rather a spectrum of religious and secular thought in which biblical ideas of a “conquering Chosen People” and visions of the United States as God’s model of the world’s future appear prominently.6 Just as in the past, these ideas link directly to the apocalyptic “defining moment,” in which a small group of leaders at the top of society summon the people to fulfill some sacred mission of redemption, or to play a new global role for the sake of humanity.7
Usually, the decisive moments occur when the president announces the mission or proclaims the godly mandate, regardless of whether the community is actually under threat. At such times, secular and religious millenarianism can generate support for policies of imperialism and war, or for advancing democratic ideals in the process of overcoming enemies.
In the 18th and 19th centuries, politicians repeatedly used different forms of this messianic national faith to justify killing Indians and acquiring their land, conquering Mexicans, and taking over the continent. In the 20th century they used it to establish a foothold in Cuba, take control of Puerto Rico, colonize the Philippines, overcome “isolationism,” and construct a global empire of a new kind.
Economic greed, racial superiority, the blind ambition of leaders, and their desire to dominate other lands and peoples remained the real justifications for killing, but invariably the civil religion concealed these baser motives. Through over two hundred years of expansion, belief in Americans as the Chosen People, morally superior to others, has reigned, enabling U.S. political leaders to repeatedly wage war more or less at will. That same belief in Americans as the Chosen People and the U.S. nation-state as God’s “redeemer nation” (Tuveson) is the basis for their intense righteousness in threatening others, yet never “allow[ing] others to call them to account.”8
For the past four years President George W. Bush has followed a line of chief executives who, for reasons of power and dominion, harkened back to the Old Testament theme of the Chosen People. But few earlier presidents made a Zen-like claim to “moral clarity” their guide for policy, or acted on the world scene with such open contempt for international law and democracy. Bush and his top ideologues have carried religious Manicheanism and the powers of the imperial presidency to new levels. In the process, they have not only violated international law but trampled on the U.S. Constitution, and turned America’s procedural democracy in a more authoritarian, repressive direction.9
Neither religious conviction nor bigotry drove them to these acts. But for reasons of domestic politics and their (Congressionally unsupervised) control of huge military forces, Bush and his cohorts chose to do them while posturing about God, American values, and the unique American mission to lead the world. Right after a group of radical Islamic killers attacked the United States, Bush went out of his way to make gestures of tolerance toward “good” (non-Christian-hating) Muslims and to deny that his “war on terrorism” was a crusade or a holy war. These acts were designed to allay fear in the Muslim community while insulating him from liability based on the speech and actions of subordinates who would have de facto authority for actually waging the holy crusade. Bush’s public posturing right after 9/11, in short, illustrated the double message that his administration sent out for the remainder of his term: formally endorse one set of rules, values, and policies for the record; secretly establish different norms, values, and policies for daily operations.
In January 2001, in his first inaugural address, Bush suggested that God operated through the people of the U.S. to achieve His purpose; a year later, in his January 2002 State of the Union address, he named an “axis of evil.”10 Although Bush was using the expressions of his speechwriters, they overlapped with his own sense of a world divided between warring powers of absolute good and evil.
At West Point, on June 1, 2002, eight months after the start of his first war, in Afghanistan, Bush expanded the targets of his “war on terrorism” to “sixty or more countries,” and declared that “moral clarity” had been essential to “our victory in the Cold War” and that now, once again, “We are in a conflict between good and evil, and America will . . . lead the world in opposing” “evil and lawless regimes.”11 “[O]ur security will require transforming the military . . . that must be ready to strike at a moment’s notice in any dark corner of the world. And our security will require all Americans to be . . . ready for preemptive action when necessary to defend our liberty and to defend our lives.” He went on to note that this would mean maintaining “military strengths beyond challenge, thereby making the destabilizing arms races of other eras pointless, and limiting rivalries to trade and other pursuits of peace.”
These lines, from the strategy articulated in 1992 under undersecretary of defense for policy Paul Wolfowitz, would soon be enshrined in the administration’s “National Security Strategy” (September 2002). The latter document made “preventive war” official state doctrine by proclaiming to the world that the U.S. would, whenever and wherever it chose, act unconstrained by international law. U.S. administrations had long been doing that without blatant public declaration. Bush, however, unabashedly announced that the U.S. no longer had need to genuflect to international law and morality, or even make excuses for its exercise of hegemony.
Two years later White House, Justice Department, and Pentagon lawyers informed him that he could authorize his underlings to order the use of torture during the interrogation of prisoners or detainees under American control, something they were already doing in Afghanistan and Guantanamo Bay, Cuba.12 Shortly afterward, on March 19-20, 2003, Bush started his second colonial war, attacking without provocation the sovereign state of Iraq, which had already been crucially weakened through a decade of the harshest UN economic sanctions ever mandated and posed no threat to any state, let alone the U.S. The war, launched in the teeth of strong opposition at home and historically unprecedented worldwide protests, was in clear violation of the UN and Nuremberg Charters and the U.S. Constitution, which gives no president or congress the power to wage “anticipatory” or “preventive” war absent real, imminent threat.
After easily overthrowing the Baathist government in Baghdad and destroying the Iraqi state, the American conquerors failed to find nuclear, chemical, or biological weapons. The administration had created a phony Iraq “threat,” then used its millenarian creed to justify fighting an immoral, illegal war to eliminate it. The major newspaper and broadcast media, more anxious to serve the state than the public, eagerly went along, highlighting the lies that the Bush administration wanted emphasized. Many citizens, conditioned to imagine themselves part of the “redeemer nation,” supported “Operation Iraqi Freedom” as part of a “war on terror.”
One year after Bush staged his “mission accomplished” victory-photo opportunity on the deck of the USS Abraham Lincoln (May 1, 2003), the nationalist resistance of the Iraqi people had stretched the U.S. military to its limit and frustrated American expectations. Rather than putting an end to terrorism, the Iraq war of Bush and British Prime Minister Tony Blair spread the terrorist threat and made their citizens objects of hatred, revulsion, and reprisal throughout the Middle East.
In the course of waging the “war against terror” American military forces committed (and continue to commit) large-scale, systemic human rights abuses against Muslims that qualify, under Nuremberg principles and later international treaties, as “crimes against humanity.”13 From Afghanistan to Iraq they have directly attacked and brutalized civilian populations, and imposed upon them collective punishments.
From the prison cages of Guantanamo to Bagram air base near Kabul, and an unknown number of secret detention facilities in Afghanistan and elsewhere, American military, CIA, and civilian contractors have subjected thousands of helpless prisoners of war to “Rumsfeld Processing.” Chief among its features are hooding, beating, sexual humiliation, sleep deprivation, standing naked for long periods of time, mental abuse, the use of dogs to intimidate, and other forms of stress, designed to make them act against their will or conscience.14 At Abu Ghraib prison near Baghdad and a dozen other Iraqi detention facilities (Al Qaim, Al Asad, Mosul, Tikrit, Umm Qasr, etc.), the charges steadily mount: murder, rape, the sodomization of children, violent beating, and theft of property on a large but unknown scale; widespread, officially ordered infliction of torture; cruel, degrading treatment of prisoners of war and “security detainees” of all ages, most of them innocent of any crime. Moreover, the Bush administration continues to authorize the hiding of prisoners and detainees, held throughout the U.S. planetary gulag, from the scrutiny of the International Red Cross.
Up and down the military and civilian chains of command, in the upper echelons of the Pentagon and on the ground, the evidence accumulates of stonewalling and lying to prevent the disclosure of incriminating facts, professional negligence, malfeasance, misfeasance, incompetence, and dereliction of duty. The evidence reveals not merely that a minority of individual U.S. soldiers committed crimes as “rogues” or “rotten apples.” Rather, under the leadership of the Bush administration, entire organizational subcultures within the White House, Pentagon, CIA, and Justice Department have become mired in criminality.
Overwhelming evidence suggests, further, that the American state has been guilty of massive, repeated violations of customary international law, treaty law, and federal statute. Specifically, the U.S. bears responsibility to the international community for having violated: the 1949 Geneva Conventions; the 1984 UN Convention Against Torture and Other Cruel, Inhuman or Degrading Treatment or Punishment; the Statutes of the UN international criminal tribunals for the former Yugoslavia and Rwanda; Article 7 (1) of the 1998 International Criminal Court Statute; and the 1994 Torture Convention Implementation Act (18 U.S.C. 2340 A). All six laws criminalize torture.
Many high-level members of the Bush administration and officials working in the “Office of the Secretary of Defense” were involved in these grave breaches of law and their cover-up. But under the doctrine of direct and imputed command responsibility the heaviest individual culpability accrues to commander-in-chief Bush and Secretary of Defense Rumsfeld. Both acted on the premise that the end (intelligence) justifies the means (torture). Bush, whose “razor-sharp distinction of the ‘good guys’ and the ‘bad guys’ . . . filtered down the ranks,” bears primary responsibility for issuing the orders and creating the ethical climate that condoned the torture of detainees.15 Rumsfeld approved not only the criminal policies establishing the U.S. global torture system but also some of the actual techniques used by lower-ranking military and CIA personnel to inflict pain.16 Their offenses cry out for criminal prosecution and appropriate punishment.