Citing the possession of weapons of mass destruction, links to terrorism, and Saddam Hussein's despotism as the casus belli for “regime change” in Iraq, President George W. Bush launched the first preemptive war in United States history on March 20, 2003. For some months, the United States had deferred to the United Nations' handling of Iraq after the passage of Security Council resolution 1441 in Nov. 2002, which initiated new weapons inspections—Iraq had spent the past decade attempting to thwart UN measures to regulate its arsenal. But by March 2003, the president concluded that the UN's patient strategy of containment and deterrence in the face of Iraq's continuing defiance was dangerously fainthearted. In the war against terrorism—and Bush had controversially identified Iraq as the next target in that battle—the United States could no longer stand by while a potential threat to its security grew into an actual one. The new U.S. defense doctrine, announced by the president in a June 2002 speech at West Point and codified in The National Security Strategy of the United States (Sept. 2002), expanded the justifications for war: “Legal scholars…often conditioned the legitimacy of preemption on the existence of an imminent threat—most often a visible mobilization of armies.” But a non-conventional war against terrorism, the document argued, requires “taking anticipatory action to defend ourselves, even if uncertainty remains as to the time and place of the enemy's attack.” The doctrine declared that the country “will not hesitate to act alone, if necessary, to exercise our right of self-defense by acting preemptively against…terrorists.” Many expressed alarm at this aggressive shift in U.S. policy—an “international hunting license” is how one critic put it. The doctrine was strongly criticized by the UN and a number of world leaders, particularly those of France, Germany, and Russia, who were apprehensive about what they saw as the U.S.'s circumvention of international law and its disregard for international consensus.
Going It Alone
In place of a UN mandate for war (the U.S. and Britain, after intensive lobbying, gained only two supporters among the 15-member Security Council: Spain and Bulgaria), the United States gathered a “coalition of the willing.” Although the group of 63 nations was little more than decorative, it allowed Secretary of Defense Donald Rumsfeld to claim that “it is larger than the coalition that existed during the Gulf War in 1991.” The difference in genuine international support between the two Iraq wars could not have been more pronounced. In 2003, besides the U.S.'s 225,000 troops and Britain's 45,000, only Australia and Poland contributed combat soldiers (2,000 and 200, respectively), whereas the UN-sponsored coalition in the 1991 Gulf War included combat troops from 32 countries. In 2003, the U.S. assumed the bulk of the war's cost, amounting to $4 billion a month; in 1991, the U.S. share of the war's expenses was just 10%.
Operation Iraqi Freedom
Along with its steadfast ally, Britain, the U.S. launched Operation Iraqi Freedom on March 20, beginning with an unsuccessful “decapitation attack” meant to eliminate Saddam Hussein and short-circuit the war. The U.S. military's promise of a campaign of “shock and awe” was in reality far more muted, but by April 9, despite meeting greater-than-expected resistance, U.S. forces took control of Baghdad, signaling the collapse of Saddam Hussein's regime. Major fighting was declared over by May 1. The United States sustained a total of 138 deaths, 116 of them in combat, between March 20 and May 1. By Dec. 15, a total of 455 U.S. soldiers had died: 313 in combat and 142 from non-hostile causes. A total of 2,249 soldiers were injured in combat; 361 were injured in non-hostile action. Unofficial figures of Iraqi civilian casualties during the war (from www.iraqbodycount.net) through Dec. 6 estimated that between 7,935 and 9,766 died. Numerous high-ranking Baathists, including Hussein's two sons (Qusay and Uday), were killed or captured in the months following the war. Finally, after eight months of searching, the U.S. military captured Saddam Hussein on Dec. 13. The deposed leader was found hiding in a hole near his hometown of Tikrit and surrendered without a fight.
Troubled Aftermath
Post-war reconstruction went far less smoothly than the war itself. After decades of Hussein's repression, economic hardship from years of UN sanctions, and its third war in 20 years, Iraq now found itself enveloped in violence and lawlessness. Many essential services, including electricity and water, had yet to be restored to pre-war levels. Iraqis strongly protested the delay in self-rule and the absence of a timetable to end the U.S. occupation. In July, diplomat Paul Bremer, whom the U.S. had designated as the chief administrator of the occupation, appointed a 25-member Iraqi Governing Council, a first step toward transferring authority to Iraqis.
Coalition forces faced daily attacks; by August, more soldiers had died in the aftermath of the war than during the period of official combat. The U.S. launched several tough military campaigns to subdue the remaining Iraqi resistance, which also had the effect of further alienating the populace. Among the most destructive acts of organized violence were the sabotage of several oil pipelines; the destruction of UN headquarters in Baghdad by a car bomb, which killed top UN envoy Sérgio Vieira de Mello; and the assassinations of one of Iraq's most important Shi'ite leaders and a member of the Iraqi Governing Council. It was unclear who was responsible for what U.S. forces commander Gen. John P. Abizaid called “a classical guerrilla-type campaign.” Donald Rumsfeld blamed “dead-enders, foreign terrorists, and criminal gangs,” but U.S. intelligence suggested that the most significant threat came from ordinary Iraqis chafing under the occupation.
Lawmakers on both sides of the aisle questioned whether more U.S. troops beyond the 130,000 stationed in post-war Iraq should be deployed. In addition to a British force of 11,000, a 9,000-strong international stabilizing force led by Poland began arriving in July 2003 to help alleviate the strain (the U.S. largely footed the bill for the Polish force). In a coolly received speech at the UN in September, Bush asked the international community to provide more troops and money for Iraq, but made it clear that decision-making would remain in U.S. hands. He had already asked Congress for $87 billion on top of the $79 billion Congress approved in April to cover military and reconstruction spending for Iraq ($11 billion was to be set aside for Afghanistan). The solidly pro-war Economist remarked that “since the objective was regime change, not just regime toppling, no triumph can be declared until a durable new regime is in place.”
November was the deadliest month for American soldiers in Iraq: several U.S. helicopters were downed and well-planned car bomb attacks also targeted troops, resulting in the deaths of at least 75 soldiers. The rising death toll prompted a reversal in the Bush administration's Iraq policy. In a deal with the Iraqi Governing Council, the U.S. agreed to transfer power to an interim government in July 2004, much earlier than originally planned.
The War's Shifting Justifications
Months of searching for Iraq's weapons of mass destruction—one of the central reasons the Bush and Blair administrations cited for launching the war—yielded no hard evidence, and both administrations and their intelligence agencies came under fire. There were also mounting allegations that the existence of these weapons was exaggerated or distorted to justify the war. In July, the Bush administration conceded that evidence purporting to show that Iraq was pursuing a nuclear weapons program by seeking to obtain uranium from Africa—cited in the president's State of the Union address and repeated by a number of top administration officials—had been discredited. Two months later, Vice President Cheney reluctantly admitted, “we never had any evidence that [Hussein] had acquired a nuclear weapon.”
With his most compelling argument for war still unsubstantiated, Bush emphasized other rationales: Hussein's brutal repression and human rights record, and Iraq as “the central front” in the war against terrorism. According to Deputy Defense Secretary Paul Wolfowitz, a functioning democracy in Iraq would “demonstrate especially to the Arab and Muslim world that there is a better way than the way of the terrorist.”
President Bush's broad, all-purpose definition of terrorism blurred the distinctions between the Sept. 11 attacks, al-Qaeda, and Iraq, creating a widespread impression among the public of their direct connection. Bush's May 1 speech declaring the end of the fighting, for example, claimed that “the battle of Iraq is one victory in a war on terror that began on Sept. 11, 2001.…We've removed an ally of al-Qaeda.” A Washington Post poll just prior to the second anniversary of Sept. 11 revealed that 69% of Americans believed that Saddam Hussein was “personally involved” in the Sept. 11 attacks, an allegation for which Bush himself has acknowledged there is “no evidence.”
A number of supporters of the war argued that murky or flawed pre-war rationales do not undermine the enormous good achieved by the war. Britain's prime minister Tony Blair maintained that “history would forgive” the UK and U.S. “if we are wrong” about weapons of mass destruction—the end to the “inhuman carnage and suffering” caused by Saddam Hussein was justification enough for the war. But opponents argued that if Washington's preemptive war was not based on a real and imminent threat but simply on the perception of one, if the evidence presented was not unimpeachably credible, and if the case for war hinged on fluctuating rationales adjusted after the fact, then the grave decision to launch an invasion becomes so perilously arbitrary and lacking in transparency that it cannot be sanctioned in a democracy. At its best the doctrine of preemption permits, in President Bush's words, “the wisdom and the will to stop great threats before they arrive.” At its worst, the doctrine becomes, in UN Secretary General Kofi Annan's words, “the unilateral and lawless use of force.”
Tax Cuts, the Deficit, and Recovery
On the domestic front, President Bush unveiled a sweeping economic stimulus plan that characteristically centered on tax cuts. The plan in its original form was to cut taxes by $670 billion over ten years; Congress approved a $350 billion version in May (which will in fact rise to an $800 billion tax cut if its sunset clauses are canceled). The plan strongly favored two groups: two-parent households with several children, and the wealthy—nearly half the proposed tax benefits were reserved for the richest 10% of American taxpayers. Critics argued that it was unsound to offer tax cuts in the midst of a jobless recovery (nearly 3 million jobs had been lost since Bush came to office), when the country was involved in an enormously expensive war, and when the federal budget deficit, according to the nonpartisan Congressional Budget Office, was expected to reach a record $480 billion in 2004. Bush continued to argue that his previous tax cuts (this was his third round) had managed to keep the recession shallow and were beginning to revive the economy. And indeed, the economy began to rebound substantially in the latter part of 2003. GDP grew by a vigorous 7.2% in the third quarter, and in the fourth quarter, unemployment began to drop as productivity increased.
But prospects remained bleak for the poor: the most recent statistics revealed that in 2002, 34.6 million people (12% of the population) lived in poverty, up 1.7 million from 2001, and the percentage of the population without health insurance rose to 15.2%, the largest increase in a decade.
Landmark Affirmative Action Ruling
In a landmark case involving the University of Michigan's affirmative action policies—one of the most important rulings on the issue in 25 years—the Supreme Court decisively upheld the use of affirmative action in higher education. The Court ruled on two cases: the University of Michigan's undergraduate program (Gratz v. Bollinger) and its law school (Grutter v. Bollinger). The Supreme Court (5–4) upheld the University of Michigan Law School's policy, ruling that race can be one of many factors considered by colleges when selecting their students because it “furthers a compelling interest in obtaining the educational benefits that flow from a diverse student body.” But the Court ruled (6–3) that the more formulaic approach of the University of Michigan's undergraduate admissions program, which uses a point system that rates students and awards additional points to minorities, had to be modified since it did not provide the necessary “individualized consideration.”
The Court held that the justification for affirmative action had evolved since its introduction in the 1960s—originally meant to redress past oppression and injustice, it now served to promote a “compelling state interest in diversity” that provided advantages for all races. A record number of “friend-of-the-court” briefs were filed in support of the university by hundreds of organizations representing academia, business, labor unions, and the military, arguing the benefits of broad racial representation at all levels of society.
See also 2003 Month-by-Month Current Events; 2003 People in the News.