A Forgotten Holocaust: US Bombing Strategy, the Destruction of Japanese Cities and the American Way of War from World War II to Iraq [*]
Mark Selden
World War II was a landmark in the development and deployment of technologies of mass destruction associated with air power, notably the B-29 bomber, napalm and the atomic bomb. An estimated 50 to 70 million people lay dead in its wake. In a sharp reversal of the pattern of World War I and of most earlier wars, a substantial majority of the dead were noncombatants. [1]
The air war, which reached peak intensity with the area bombing, including atomic bombing, of major European and Japanese cities in its final year, had a devastating impact on noncombatant populations.
What is the logic and what have been the consequences—for its victims, for subsequent global patterns of warfare and for international law—of new technologies of mass destruction and their application associated with the rise of air power and bombing technology in World War II and after? Above all, how have these experiences shaped the American way of war over six decades in which the United States has been a major actor in important wars? The issues have particular salience in an epoch whose central international discourse centers on terror and the War on Terror, one in which the terror inflicted on noncombatants by the major powers is frequently neglected.
Strategic Bombing and International Law
Bombs had been dropped from the air as early as 1849 on Venice (from balloons) and in 1911 in Libya (from planes).
...
The strategic and ethical implications of the nuclear bombing of Hiroshima and Nagasaki have generated a vast contentious literature, as have German and Japanese war crimes and atrocities. By contrast, the US destruction of more than sixty Japanese cities prior to Hiroshima has been slighted both in the scholarly literatures in English and Japanese and in popular consciousness in both Japan and the US. It has been overshadowed by the atomic bombing and by heroic narratives of American conduct in the “Good War”, an outcome not unrelated to the emergence of the US as a superpower. [5] Arguably, however, the central technological, strategic and ethical breakthroughs that would leave their stamp on subsequent wars occurred in area bombing of noncombatants prior to the atomic bombing of Hiroshima and Nagasaki. A.C. Grayling explains the different responses to firebombing and atomic bombing this way: “. . . the frisson of dread created by the thought of what atomic weaponry can do affects those who contemplate it more than those who actually suffer from it; for whether it is an atom bomb rather than tons of high explosives and incendiaries that does the damage, not a jot of suffering is added to its victims that the burned and buried, the dismembered and blinded, the dying and bereaved of Dresden or Hamburg did not feel.” [6]
If others, notably Germany, England and Japan, led the way in area bombing, the targeting for destruction of entire cities with conventional weapons emerged in 1944-45 as the centerpiece of US warfare. It was an approach that combined technological predominance with minimization of US casualties in ways that would become the hallmark of the American way of war in campaigns from Korea and Indochina to the Gulf and Iraq Wars and, indeed, define the trajectory of major wars since the 1940s. The result would be the decimation of noncombatant populations and extraordinary “kill ratios” favoring the US military. Yet for the US, victory would prove extraordinarily elusive. This is one important reason why, six decades on, World War II retains its aura for Americans as the “Good War”, and why Americans have yet to come to grips effectively with questions of ethics and international law associated with their area bombing of Germany and Japan.
The twentieth century was notable for the contradiction between international attempts to place limits on the destructiveness of war and to hold nations and their military leaders responsible for violations of international laws of war (Nuremberg and Tokyo Tribunals and successive Geneva conventions, particularly the 1949 convention protecting civilians and POWs) and the systematic violation of those principles by the major powers. [7] For example, while the Nuremberg and Tokyo Tribunals clearly articulated the principle of universality, the Tribunals, both held in cities that had been obliterated by Allied bombing, famously shielded the victorious powers, above all the US, from responsibility for war crimes and crimes against humanity. Telford Taylor, chief counsel for war crimes prosecution at Nuremberg, made the point with specific reference to the bombing of cities a quarter century later: [8]
Since both sides had played the terrible game of urban destruction—the Allies far more successfully—there was no basis for criminal charges against Germans or Japanese, and in fact no such charges were brought . . . . Aerial bombardment had been used so extensively and ruthlessly on the Allied side as well as the Axis side that neither at Nuremberg nor Tokyo was the issue made a part of the trials.
From 1932 to the early years of World War II, the United States was an outspoken critic of city bombing, notably but not exclusively German and Japanese bombing. President Franklin Roosevelt appealed to the warring nations in 1939, on the first day of World War II, “under no circumstances [to] undertake the bombardment from the air of civilian populations or of unfortified cities.” [9] Britain, France and Germany agreed to limit bombing to strictly military objectives, but in May 1940 German bombardment of Rotterdam exacted 40,000 civilian lives and forced the Dutch surrender. Up to this point, the bombing of cities had been isolated, sporadic and for the most part confined to the Axis powers. Then in August 1940, after German bombers struck London, Churchill ordered an attack on Berlin. A steady escalation of bombing targeting cities and their noncombatant populations followed. [10]
...
http://www.japanfocus.org/-mark-selden/2414