Fighting Fair

The laws of war and how they grew.

As captured al-Qaida fighters fill the prisons of Guantanamo Bay, Cuba, the United States is planning to try them not as ordinary criminals, the way it tried past terrorists such as Ramzi Yousef, ringleader of the 1993 World Trade Center bombing, but as war criminals. That means they won’t be tried under the U.S. criminal code for mass murder; they’ll be charged with violating the “laws of war,” a body of rules that the world’s nations have collectively crafted over 137 years of conventions in Geneva and The Hague—but enforced with only sporadic success.

The idea that certain fixed laws should apply even amid the violence and anarchy of war isn’t new. The saying may have it that all’s fair in war, but restrictions on battlefield conduct have always been recognized. The Hebrew Bible forbade soldiers from, among other things, destroying fruit-bearing trees in hostile lands, and chivalric codes existed in the Middle Ages. It was the Dutch philosopher Hugo Grotius (1583-1645), however, who came to be seen as the Solon of today’s laws of war. His influential 1625 work On the Law of War and Peace argued that there exist natural laws, independent of any individual state’s legal system, that are apparent to human reason and should prevail even during hostilities.

Over the next two centuries, Enlightenment philosophers built upon Grotius’ ideas. Montesquieu argued against the killing of prisoners of war. Rousseau insisted that prisoners receive humane treatment and be freed after hostilities ended (instead of being enslaved). These ideas made their way into various friendship pacts signed between nations and the war manuals individual countries issued to their soldiers.

The unprecedented carnage of 19th-century warfare led Western nations to try to mitigate the most unnecessary forms of battlefield suffering. Innovations in weaponry and the advent of “total war,” which exacted unconditional surrender from the defeated party, had made combat deadlier than ever before. In the Civil War (1861-1865), 600,000 American soldiers died—more than in all other American wars combined, before or since. In the Crimean War (1853-1856), in which Russia fought England, France, Turkey, and Sardinia, more than 100,000 people died of typhus, cholera, dysentery, and other diseases.

In 1864, prodded by the recently created International Committee of the Red Cross, 12 European nations met in Geneva, Switzerland, to devise rules for treating the sick and wounded in battle. The resulting Geneva Convention for the Amelioration of the Condition of the Wounded in Armies in the Field was signed on Aug. 22, 1864, and ratified by the United States—18 years later. The 1864 accord stated that ambulances and hospitals were to be regarded as neutral; that hospital workers and patients were to be unmolested; that wounded or sick soldiers would receive medical care, regardless of their loyalties; and that the Red Cross could freely travel through combat zones to aid the wounded. The convention marked the first of many treaties establishing laws whose violation would be taken to constitute war crimes.

After the Geneva Convention came a series of conferences in The Hague in 1899 and 1907, which established a vastly more comprehensive body of wartime rules. These treaties still form the core of today’s laws of war. They required that prisoners receive decent food, shelter, and clothing; that guerrillas and other citizen-soldiers obey the same laws as official military personnel; that combatants respect institutions devoted to religion, charity, education, art, and science; that surrendering enemies not be killed or injured; that defenseless towns or buildings not be attacked; and that soldiers not pillage or confiscate property. Other accords restricted the use of certain weapons.

Just as the Crimean and Civil wars had prompted the first Geneva Convention, World War I, whose trench warfare and poison gas produced record death tolls, spurred the world to bolster rules that now seemed inadequate. The Kellogg-Briand Pact of 1928, signed by most of the world’s major powers, renounced war as a tool of policy; though largely symbolic, it formalized the notion that aggression against another state violated international law, and it was cited in the post-World War II trials of Axis leaders. Also adding to the burgeoning body of law were the 1925 Geneva Protocol, barring the use of poison gas and biological weapons; the 1929 Geneva Convention, improving the rules on the treatment of POWs and the wounded; and other treaties regulating the wartime use of submarines.

Buttressing the laws of war, alas, did not check the human instinct for savagery, as World War II and the Holocaust showed, and the war’s aftermath produced a slew of efforts to stop such horrors from recurring. In 1948 the new United Nations adopted the Universal Declaration of Human Rights, which asserted that all human beings should enjoy a host of civil and social freedoms. Also that year an international genocide treaty outlawed any effort to exterminate a people. (The United States ratified it in 1988.) And the 1949 Geneva Conventions again expanded the laws of war, extending protections to noncombatants and augmenting the rights accorded to POWs and the wounded. Almost every nation on the globe has now signed the Geneva Conventions.

With the continual refinement of the laws of war and their near-universal adoption, their general thrust is by now fairly clear-cut (though the details, like those of any laws, remain open to interpretation). In contrast, questions of how to enforce international laws remain nettlesome. Until World War II, efforts to try war criminals had generally proved unsatisfactory. After World War I, the Versailles Treaty required Germany to let the Allies try its war criminals, but the Allies let the German Supreme Court conduct the trials, and only 13 of 900 defendants were convicted. Unless the world could develop some consistent, principled, and effective method of enforcement, the Hague and Geneva conventions would remain little more than an interesting sheaf of papers.

One method of enforcement rested on an old principle: require individual countries to do the punishing themselves. (Aut punire aut dedere, as Grotius put it: either punish or hand the offender over to others who will.) The 1949 Geneva Conventions stipulated that signatories must search for suspected war criminals and either try them or turn them over to other nations willing to do so. Occasionally states have complied, as the United States did during the Vietnam War, prosecuting American servicemen accused of violating the laws of war. But on the whole, states have rarely been willing to charge their own soldiers with war crimes, and even more rarely to let other countries or international courts do the work. The recent flap over the attempted extradition and trial of Chile’s former dictator Augusto Pinochet underscored the difficulty of getting nations to agree to what in principle should be a workable system.

The other method pioneered after World War II was the international tribunal. The Nuremberg and Tokyo trials of Axis leaders were conducted by multinational courts under procedures widely deemed to be fair and open. Dozens of German and Japanese officials, including Hermann Goering and Hideki Tojo, were convicted and given the death penalty or other stiff sentences. Although limited in scope, the trials offered hope that the laws of war might for the first time carry real weight.

There were critics, to be sure, who decried the Nuremberg and Tokyo trials as “victors’ justice”—which of course they were. But the World War I experience had shown the futility of letting the defeated try their own, and in fact the international court represented a great advance over past forms of victors’ justice, such as show trials or summary executions (which Winston Churchill at one point recommended for the Nazi leaders). As U.S. Supreme Court Justice Robert Jackson, America’s chief prosecutor at Nuremberg, noted, “That four great nations, flushed with victory and stung with injury, stay the hand of vengeance and voluntarily submit their captive enemies to the judgment of the law is one of the most significant tributes that Power has ever paid to Reason.” The question was not whether the victors would mete out justice, but what kind of justice they would mete out. (Besides, hundreds of lesser war criminals were tried in German courts—as well as French, British, Polish, Dutch, American, and Soviet ones.)

But in the end international courts proved as feckless as national courts in enforcing the laws of war. Indeed, for 45 years after the Tokyo trials ended in 1948, no international war crimes court existed at all—making it rather hard to use one to punish war criminals. The Bosnian and Rwandan genocides of the 1990s did lead the United Nations in 1993 and 1994 to set up courts for trying the perpetrators of those crimes, and in 1998, 120 nations (the United States not among them) agreed to establish a permanent international court. These moves have offered some hope that the laws of war might be given real bite, but their long-term effectiveness remains uncertain. After all, it’s hard to imagine that suspected war criminals from powerful countries—whether Russian butchers in Chechnya or Henry Kissinger in the United States—will ever be hauled before an international tribunal. Who would go and arrest them?

For now, then, the best we can hope for is that the al-Qaida captives will be tried in accordance with the Geneva Conventions and for crimes that clearly violate the established laws of war. If so, the laws themselves will continue to have meaning, even if the ability to enforce them remains a function of raw might.