Month: May 2014

Two millennia ago, the Roman Empire ruled over most of Europe, Christianity had not yet come into existence, and the global population is estimated to have been in the hundreds of millions. Two millennia earlier, complex systems of trade and significant urban centers were already in existence in Egypt, the Middle East, and China; by then, bronze and early iron tools had been in use in various parts of the world for hundreds of years. Another tool, writing, had emerged in its earliest forms in Sumer, marking the beginning of recorded history. Another 5 millennia earlier, the cultivation of wheat and barley had begun in Mesopotamia. Twenty millennia prior to this, Europe had been populated by humans, as had North America. The first cave paintings and the earliest evidence of fire and tool use date to 20 or 30 millennia before that. The species Homo sapiens had left its birthplace of Africa at least 100 millennia earlier.

The average human lifespan is around 80 years, a mere 8 percent of a single millennium. If I live a full life, I will have experienced less than 0.1 percent of human history. I am part of a far grander, far longer narrative. I am a human in the year 2014, a member of modern civilization, living 150 millennia after the birth of our species.

I am alive, descended from a common ancestor shared with modern chimpanzees, a divergence that took place around 5 million years ago. The earliest primates developed at least 65 million years ago from a branch of the mammal clade, which itself came into being at least 250 million years ago. Life first walked on land 350 million years ago. The first animals, in the form of early fish and bilaterians, appeared 600 million years ago. Life on Earth, in the form of simple cells, has existed for the last 3.5 billion years. The Earth upon which that life has lived is 4.6 billion years old. The universe in which the Earth exists is 13.7 billion years old, three times the age of our planet. Through the vast breadth of history that predates me, my body has developed into the tool I use to interact with reality. I encounter the world around me through senses that have been developing for millions of years. The brain which endows me with my intellect is the product of millions of years of evolution. My species relies upon its heritage for its character, a heritage it is unable to comprehend.

Astronomers predict that the universe will continue to exist in its current form and function for at least another 100 billion years. By then, our star will have been dead for tens of billions of years, along with all of the life that lived on this planet. Earth, now 4.6 billion years old in a universe expected to last at least 100 billion, is a young planet, and my species, in only its 150,000th year, is a young species. I have appeared at the very beginning of existence.

As a human being, I belong to a species that, over millennia of development, has evolved the ability to recognize and to define its sense of selfhood through a process of conscious, directed thought. I think and act because of the mechanical processes at work within my physical self, mostly but not entirely localized to my brain; because of the firings of my neural synapses. As with other animals, my behavior is largely determined by the subconscious mind and instinctual programming. Through sensory input I perceive local stimuli and, having consciously recognized them, produce a response. Is this much different from the instinctual actions of a dog, or a squirrel, or an ant? When I fall asleep, I dream. So too do dogs and cats. I recognize loved ones and grieve over their deaths. So too do elephants and apes. I can approach problems, recognize patterns, and produce complex solutions. So too do crows. The human experience is created in the same manner in which any other animal experiences life. My understanding and experience of life is distinctly human, but it does not stand alone. As I play with my pets, I wonder if they recognize their selfhood. It is part of my selfhood to be able to ask this question, but my brain is not sufficiently evolved to know whether their brains, and their programming, allow them to develop this thing my species calls selfhood. Even if they do not, they must be aware of the world around them; they too experience being. I realize that my experience, the human experience, is only one of an array of possible ways of being. I am a human being, but I am part of a far grander, far more diverse world of life with which I share most of my DNA.

I share this world with 7 billion other individual beings of my kind. I have my own fears, hopes, dreams, ambitions, desires, worries, and anxieties. I have my own personality, my own sense of humor, and my own sexual preferences. I have my own prejudices and hatreds. So too does everyone else. There is nobody in the world exactly like me, just as I am not exactly like anyone else in the world. Yet, as a human being, I am far more similar to everyone around me than I generally acknowledge. From this recognition comes my ability to sympathize, to empathize, and to relativize. There are many flavors of human character, but ultimately we are all driven by the same motivations and urges. How could it be otherwise?

I am a white, Jewish American with European heritage living in North America. My cultural, social, ethnic, and environmental circumstances, the very essence of my identity and my selfhood, are the products of mass migrations, geographic changes, and racial interactions. I have spent my entire life in the American Northeast, but my blood has no connection to this land. My family, scattered across the Old World because of the diaspora, immigrated to America from Germany and Russia in the early 20th century. Before that, they lived in small Orthodox Jewish communities in Poland and the Ukraine, mixing and intermarrying with the local population. Before that, they lived, oppressed, in Spain. Before that, they lived in Israel. The Jewish people, my people, can trace their history and heritage back four thousand years to Israel’s earliest tribes. Yet before that identity was constructed, they were Semites living in the Middle East. Before that, they were proto-Semitic migrants from North Africa. Before that, they were early human migrants from the Great Rift Valley in Africa. My blood is an amalgam of different peoples from different places with different cultures. My blood comes from across the world.

My people have been the target of hatred and oppression for most of their history. Jews have been called sub-human, worse than vermin. Because of my heritage, I myself have been the target of hatred for parts of my life. Because of my heritage I am, for some, a source of disgust. My identity is an insult. But my identity is a construct, developed over thousands of years of migration and interbreeding and cultural development. Those who despise me shared my lineage thousands of years ago. Those who despise me were born from the same process of biological and geological development, from the same course of events which brought me onto this Earth. How can my Jewish identity be such an insult, when it is only a sliver of what composes my human identity? How can “otherness” exist, when the characteristics which separate us are only a sliver of what composes our human identity? I am a Jew, but more so, I am a human. I am despised for my differences, but my differences are ultimately insignificant. I cannot bring myself to see other cultures and people as the “other.” The “other” are still human beings, their differences only part of a far grander, far longer narrative.

I live in an upper-middle-class household in this 1st century after the American Century, in a democracy that has produced the highest per capita wealth in human history. I attend a college whose annual tuition is higher than the poverty line. I live in the most powerful country on Earth, and see the world through the lens of national exceptionalism. Unlike most of the peoples of the world, living now or before me, I have never struggled with hunger, thirst, homelessness, or need. I am distanced from the horrors of war, genocide, disease, prejudice, and oppression. I owe these circumstances to my parents, whom I did not choose. I owe these circumstances to the location in which I live, which I did not choose. There are children starving in Africa because they were born to parents whom they did not choose. There are civilians dying in Syria because they are living in a location that they did not choose. Millions of human beings live in poverty because of circumstances outside of their choice or control. My existence is privileged, by virtue of random lucky chance.

In such a life of comfort, an overwhelming sense of complacency arises. I do not struggle, for I was not born into circumstances in which struggle is necessary for life. Yet I recognize that I owe this to luck, and that not everyone who lives on this planet has been favored by such luck. I live in a culture that teaches me (but not all of my peers) that good fortune makes it my moral responsibility to give to charity, to volunteer, to serve humanity. I make it my duty to do good works for those who are not favored by good circumstance. Elements of my cultural heritage teach me to ask: why should I live in comfort while others struggle, when I have done nothing to deserve this comfort? Yet other elements of my cultural heritage teach me to ignore the plight of others, to live satisfied in the comfort I possess. There is a constant struggle between the norms of my society, from which the concept of the unimportant “other” is produced, and my own moral standards. Still, I cannot see myself as a moral human being if I accept the suffering of others without action, justifying their circumstances as simply bad luck.

Like most people living in this place on this planet in this era in this country in this socioeconomic context, when I wake up in the morning I turn off my alarm and brew a cup of coffee. The electricity used to power these tools has been harnessed by humanity for only a little more than a century, but for me it has always existed. It is a fundamental part of my life, allowing me to live with the comfort and ease I have come to expect. I dress myself in a style of fashion that has been the norm for only a decade, a fact so ordinary that I can barely bring myself to recognize it. To get to work, I commute in my car, an invention little more than a hundred years old. I work on my computer, a device only 40 years old, and access the web, publicly available for barely two decades. But, again, for me, these have always existed. I have never known life without these tools. I have no concept of an identity without interaction with technology, and it is possible that my very cognition has begun to transcend my body and inhabit my tools, though perhaps such a process has a longer history than I am inclined to see. Back home, I turn on my color television, first sold half a century ago; start my video game console, first sold half a decade ago; and play my favorite video game, sold only a year ago, while chatting with my friends through my smartphone, an invention just as young, using lingo and slang so particular to my generation that people born only two decades earlier can barely interpret it. In school, I study philosophical and political doctrines that have existed for mere centuries, some even less than that, yet often taught as “old.” The humanistic, secular worldview I hold is a modern phenomenon. I do not need to worry about my atheism, a belief once so heretical that, a thousand years ago, it would have been cause for my execution. My way of life is a new invention; nearly everything I know and nearly everything I use has come into existence in recent history.
I can choose my own spouse and have a choice in how many children I want, a privilege that has only recently become the norm. I can choose what state or continent I wish to live in, an ability that has only recently been available to the average individual. I do not need to forage for food or search for shelter, having been born into a society that readily offers them. I buy food in a store, food that is planted and picked and transported by people I will never meet, in fields where I will never set foot, using a currency of exchange that I rarely handle in its physical form. I am part of modern civilization, socialized and conditioned to its norms, procedures, and elements. Yet modern civilization is fleeting; it is part of a far grander, far longer narrative.

Would Cody Knipfer have succeeded in the distant, or even recent, past? Would I have been the person I am today, holding the perspectives and values I have today, had I been born in the past? Everything I know and think is the product of modern civilization. How fluid our identities must be, if they can be entirely transformed simply by the time and place in which we live. Can we ever truly know ourselves separate from the context of place and time? Are these integral parts of our identity, of our conception of selfhood? Cody Knipfer is a 21st century human being. Had I been born in any other time, I would not have been me. I would have been shaped by different values, socialized to different norms, given a different perspective. The inherent workings of the world, categorized and described by the human constructions of science, religion, and philosophy, would have looked and operated very differently to me. Such recognition troubles me. I do not know who Cody Knipfer is as he exists in the state of nature. Do I, then, really know Cody Knipfer?

People alive today remember the names and deeds of Adolf Hitler, John F. Kennedy, or George Washington. These individuals are not too far removed in place or time to have been forgotten. Yet fewer people remember the names of William de Normandy, conqueror of England, or Pepin the Short, father of Charlemagne. Fewer still remember the names of the leaders of the Roman Republic as it approached the height of its conquests, ruling over an empire far greater than any that exists today. Only the most serious scholars will recall the names and deeds of the earliest Egyptian pharaohs or the earliest kings of Mesopotamia, who commanded great armies and were regarded as living deities. Nobody knows the names of the first tribal leaders and chieftains, who oversaw the beginnings of human civilization.

The lives of the countless generations that preceded civilization are almost complete mysteries to us. Despite their deeds, their names are lost forever to history, forgotten in the great distance of time that has since passed. For even history, the narrative of human civilization, is an invention of the recent past. How many countless souls must have aspired to greatness, only to be forgotten? I aspire to greatness; my ambition is the driving force behind my thoughts, my words, and my actions. I want my name to be in the history books, my deeds to be remembered for ages. Despite my deeds, will my name too be lost as my civilization ages? Will I too be forgotten as hundreds, and then thousands, and then hundreds of thousands of years pass? The memory of my existence is fleeting. I am part of a far grander, far longer narrative.

It is a natural desire to want to leave a legacy. It is how I feel my accomplishments and contributions are given value. It is how I feel my life is given weight. I have won numerous awards in school and accolades in my work life. I have worked hard through stress, or what I have been socialized to perceive as “stress,” to achieve these distinctions. Yet just as my mortal body will pass away, so too will my legacy. The vastness of time and space will inevitably swallow up every accomplishment humankind has ever made. Should I struggle as hard as I do to leave a legacy which will eventually be forgotten? I find myself torn between a nihilistic sense of futility and a belief, owing to a cultural heritage with origins in those “ancient” philosophers, that my work, regardless of its ultimate fate, will have helped somebody. I cannot control the vastness of time. I cannot ensure the survival of my memory. I can, however, make a tangible difference in someone’s life. My identity is one which recognizes its insignificance on the macro scale, but which recognizes its importance on the human scale.

I am Cody Knipfer, living in the United States of America on the planet Earth in the Orion Arm of the Milky Way galaxy in the Local Group of the Virgo Supercluster 13.7 billion years after the creation of the universe. I am an individual human being among 7 billion others, living among millions of other forms of life. I am the product of chance, probability, and change entirely outside of my control. My identity is developed by the factors in my environment which I do not or cannot perceive. I am a part of a far greater narrative.

Warfare has been a common element of interstate conduct since the dawn of civilization, constantly evolving as technology and strategy develop. Like many other facets of international interaction, the conduct of warfare has been subject to a number of limitations, treaties, and agreements concerning its nature. Similarly, the targets of war and the weapons employed have also been limited and selectively chosen by the international community. The modern conduct of war is now limited and regulated by a wide array of treaties and laws, though the governing of war has had a long and varied history. Understanding the sources and history of these rules of war, as well as why they exist, helps international legal scholars understand the conduct of nations during conflict.

The sources of the laws of war are similar to the sources of all international law: treaties and international conventions, international customs and traditions, and general principles of law recognized by civilized nations. International treaties and conventions covering issues related to war have massively shaped the nature and conduct of war in the past century, especially as the binding nature of treaties has increased substantially during that period. These treaties, however, have largely codified earlier international customs and traditions regarding warfare. Such customs develop as large numbers of states adhere to the same principles or practices and come to view those practices as generally obligatory. Additionally, states develop laws of war around general principles, which are commonly accepted beliefs about the nature of war. These principles include a belief that violence should not be allowed beyond the point of necessity, and that a degree of ‘chivalry’, which demands a certain amount of fairness and mutual respect between opposing forces, be maintained.[1]

Armed conflict has existed since the dawn of civilization, and with it have come laws and agreements governing the use of force. The earliest known examples of rules governing the conduct of war are found in the Hebrew Old Testament. Deuteronomy 20:19-20 limited the amount of acceptable collateral and environmental damage during a siege, while Deuteronomy 21:10-14 prohibited the selling of female captives as slaves.[2] Early Christian thinkers theorized about the ethical nature of war and attempted to codify what they determined to be a ‘just war’. The ideas of St. Augustine would later be used by the theologian Thomas Aquinas to determine what circumstances made a war just. He argued that a war must be waged for a good and just purpose rather than for self-gain or as an exercise in power, that it must be waged by and against a properly instituted authority such as the state, and that peace must be a central motive even in the midst of violence.[3] In the 7th century Arab world, the first Muslim Caliph, Abu Bakr, commanded his army to spare women, children, and the elderly, to leave trees standing, not to ravage the enemy’s livestock, not to mutilate dead bodies, and to leave monks alone. Furthermore, scripture in the Koran commanded Muslims to strike their enemy only in self-defense and to spare enemies in retreat.[4]

Laws governing the conduct of war were present in East Asia throughout much of history as well. In the Chunqiu (Spring and Autumn) period of ancient China, feudal lords organized conferences on the elimination of war and disarmament in the Chinese state of Song. During the Warring States period, rules existed among the seven feudal states regarding the declaration of war, the conclusion of peace, and the favorable treatment of prisoners of war, as well as rules that called for the sparing of retreating enemies and the elderly, prohibitions on surprise attacks and ruses, and protection for the wounded.[5] These rules, while setting precedents regarding conflict and codifying norms which spread across the region, were not strictly adhered to. Historical accounts of one battle of this era describe how Qin forces buried alive as many as 400,000 prisoners of war, for example.[6]

Early Medieval movements in Christian Europe also attempted to limit the scope of warfare and protect noncombatants. The ‘Peace and Truce of God’ was a movement by the Catholic Church that applied spiritual sanctions in order to limit the violence of private war in feudal society. Its origins were in the years following the collapse of the Carolingian Empire in the ninth century, during which time France had devolved into many small feudal holdings fighting against each other. The Truce of God attempted to pressure nobles and knights into committing to peace and renouncing private wars by invoking the power of sacred relics. Certain days of the year were also set aside on which violence was not allowed. The Peace of God was a proclamation issued by local clergy that granted immunity from violence to defenseless noncombatants. While these movements were unsuccessful in controlling warfare in the violent reality of feudal society, the precedents they set (sparing noncombatants and limiting violence) would permeate Medieval society and be taken up by successive movements.[7] During the 14th and 15th centuries, the institution of neutrality began to develop. Treaties containing provisions on neutrality were signed, initially meaning only that one party refused to aid belligerents in war, and later including stipulations that belligerents should not attack the neutral party. Additionally, maritime military law began to develop during the Middle Ages. ‘Prize Law’, which covered the plunder of captured enemy ships and maritime trade, was codified during this period. Prohibitions were created on trading goods with the enemy or transporting goods on the enemy’s behalf, as well as provisions on the fate of captured ships.
Additionally, the institution of privateering arose around the 14th century, and decrees were issued outlining the methods of granting permits to privateers that would render them legitimate.[8] Hugo Grotius, the Dutch jurist and international legal scholar, would discuss the law and nature of war in his “De Jure Belli ac Pacis” of 1625. He identified three ‘just causes for war’: self-defense, reparation of injury, and punishment. He further identified rules that govern the conduct of war, and held that all parties to a war are bound by those rules, whether their cause is just or not.[9]

The invention and development of firearms in the 18th and 19th centuries, together with advances in science and technology, accelerated the advent of new ways of war, as well as the increased killing potential and brutality of conflict. In response, the conduct of warring parties was standardized in order to limit the destructive capability of war. The 1856 Declaration of Paris abolished privateering, exempted enemy goods shipped under a neutral flag from capture, and made blockades binding only if they were effective, maintained by a force sufficient to prevent access to the enemy coast. The First Geneva Convention of 1864 dealt with the humane treatment of the wounded and the sick in armed forces in the field. It required that wounded soldiers be protected from inhumane treatment, that the dead have their information and identity recorded and transmitted, and that impartial humanitarian organizations be permitted to provide protection and relief for the wounded. The St. Petersburg Declaration of 1868 banned the use of fragmenting, explosive, and incendiary small arms ammunition; its delegates affirmed that the only legitimate object of war should be to weaken the military forces of the enemy. The First Hague Peace Conference followed on May 18, 1899. It was convened with 26 participating states, and its purpose was to ‘limit arms and to safeguard peace’. The Second Geneva Convention of 1906 required belligerent parties to protect and care for the shipwrecked, prohibited the capture of neutral vessels, and made hospital and humanitarian ships non-targets. The Second Hague Peace Conference of 1907 concentrated on the laws of war.
It prohibited the launching of projectiles and explosives from balloons and included conventions which stipulated that war could not begin without advance and explicit warning, drew distinctions between combatants and noncombatants, called for the humane treatment of prisoners of war, prohibited the destruction or confiscation of enemy property, and protected cultural and historical landmarks as well as medical facilities. Additional conventions stated that, during occupation, the occupying power must restore and ensure public order and safety while respecting the laws of the nation it occupies, and that neutrality must be respected. Many laws of naval war were also codified, covering the status of enemy merchant ships, the laying of submarine mines, naval bombardment, the obligations of neutral powers during naval war, and the issue of blockades. The London Naval Conference of 1908-1909 added to these laws of naval war. The Geneva Convention of 1929 dealt with the definition, protection, and humane treatment of prisoners of war, specifying the types of labor they were permitted to do and the terms of their captivity.[10]

The Second World War was a catastrophic event for human society, and each of the belligerent powers violated the laws of armed conflict. At the end of the war, representatives of a large number of states converged in Geneva and deliberated on the formulation of new laws governing conduct during war. On August 12, 1949, 63 states signed the new Geneva Conventions. These conventions updated and supplemented all of the prior Geneva conventions. They affirmed that the sick and wounded of any party shall receive impartial treatment and may not be killed, tortured, or subjected to experimentation; that naval medical personnel, ships, and the wounded must be protected; that civilians in the power of a belligerent party shall receive protection and humane treatment; that civilian settlements must not be destroyed; and that social norms (cultural, religious, etc.) must be respected. The rules included in the Conventions of 1949 are applicable to all armed conflicts, declared or not; each convention is binding on each belligerent party in a war, regardless of whether all parties are signatories; and the rules apply in conflicts of intranational scope, such as civil wars, guaranteeing that the victims of such wars receive minimum protection.[11] Additionally, in 1977, representatives of many states gathered in Geneva and signed the ‘Protocols Additional to the Four Geneva Conventions of 1949’. These protocols expanded the scope of the conventions to cover armed conflicts against colonial domination, alien occupation, and racist regimes, and greatly increased the number of provisions protecting occupied territories and strengthening the protection of civilians.
The Treaty on the Non-Proliferation of Nuclear Weapons, which entered into force in 1970, specified that non-nuclear states should not produce, and thereby proliferate, nuclear weaponry, and that nuclear-armed states should begin to disarm their arsenals. The Anti-Ballistic Missile Treaty, signed in 1972, required the United States and the Soviet Union to limit their anti-ballistic missile arsenals, thereby preserving the possibility of ‘mutually assured destruction’ and thus reinforcing the deterrent effect of nuclear weapons.

There have been many recent developments in the law of war. The Convention of October 10, 1980, on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects prohibited weapon systems deemed to cause excessive injury or excessive collateral damage, including certain mines and incendiary weapons. In October 1995, the Vienna Diplomatic Conference issued the Protocol on Blinding Laser Weapons, which prohibited the use and transfer of laser weapons designed to cause permanent blindness. The Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction, signed in 1997 by 121 states, made the use of anti-personnel landmines illegal in war and provided for their destruction. The START treaties between the United States and Russia, signed in 1991 and succeeded by New START in 2010, greatly reduced the number of strategic nuclear weapons that the two countries could possess, along with other forms of strategic weaponry. These treaties are among the largest and most comprehensive arms reduction treaties in history.[12]

The development of the United Nations following the Second World War changed the way that states could declare and justify war. The Charter of the United Nations declares that states should resolve all of their conflicts peacefully, theoretically eliminating the justification for war. However, Article 51 of the Charter specifies that “nothing in the present Charter shall impair the inherent right of individual or collective self-defence if an armed attack occurs against a Member of the United Nations, until the Security Council has taken measures necessary to maintain international peace and security”. Thus, nations are only justified in waging war if they are doing so in self-defense against an attack. Article 42 of the Charter stipulates that, “Should the Security Council consider that measures provided for in Article 41 [which discusses sanctions and diplomatic coercive measures] would be inadequate or have proved to be inadequate, it may take such action by air, sea, or land forces as may be necessary to maintain or restore international peace and security”. The Security Council of the United Nations thus operates as a governing body which decides when armed conflict is justifiable. Additionally, it is through the Security Council that the use of armed force can be called for or sanctioned. The significance of the Security Council and the United Nations is that they govern the declaration of war, and that all wars ‘acceptable’ to the international community must thus come with the support of the United Nations.[13]

Despite the codification and prevalence of the laws of war, states and leaders are susceptible to violating them. In cases of intentional disregard for the laws of war, international criminal justice provides a safeguard and a method to prosecute violators. In 1945, the Soviet Union, the United States, the United Kingdom, and France agreed to establish an International Military Tribunal, signing the Agreement for the Prosecution and Punishment of the Major War Criminals of the European Axis as well as the Charter of the International Military Tribunal. The tribunal would, according to its charter, prosecute ‘crimes against peace’, which included the planning, preparation, initiation, or waging of a war of aggression or a war in violation of international treaties, agreements, or assurances; ‘war crimes’, which are violations of the laws or customs of war; and ‘crimes against humanity’, which involve murder, extermination, enslavement, deportation, and other inhumane acts committed against any civilian population before or during a war. The Nuremberg Trials against the defeated Nazi leadership and the Tokyo Trials against the Japanese leadership following the Second World War were applications of these charters; they convicted much of the upper leadership of defeated Germany and Japan of war crimes, setting a precedent for international war crimes trials.[14] In 1993, the United Nations Security Council established the International Criminal Tribunal for the Former Yugoslavia, which prosecuted those responsible for violations of international humanitarian law committed since 1991 during the conflicts in the Balkans. Additionally, in 1994, the Security Council established the International Criminal Tribunal for Rwanda to prosecute those responsible for genocide and other serious violations of international humanitarian law during Rwanda’s civil war and subsequent genocide.
The adoption of the Rome Statute of the International Criminal Court in 1998 was another important step toward ensuring that all states abide by international humanitarian law. The newly established International Criminal Court was given jurisdiction over all war crimes. These new precedents in the legal traditions of the international community mean that those who violate the accepted laws of war are subject to trial and prosecution by international legal bodies. The hope is that this will serve as a deterrent against future war crimes.

Thus, the international community has developed rules, norms, and regulations governing war in order to realize certain principles regarding conflict and to mitigate the hardships that arise from war. The laws of war attempt to facilitate the realization of these principles, while also governing the conduct of war in order to prevent actions that would set them back. These principles include the goal of ending hostilities quickly, the hope for limited wars, and the protection of people and property from conflict. The hardships that the laws of war hope to mitigate thus include the destruction of noncombatant property and life, the disregard for human rights, and the declaration of ‘unjust’ wars.

One of the primary guiding principles behind the laws of war is the hope that peace will be restored quickly. This is generally only possible when the belligerent parties fight wars of limited scope and intensity, and with limited goals. The nature of war means that states will use the maximum extent of their force in order to achieve their goals, while also reciprocating the actions of their enemies. Because of this, laws are needed to govern how much force can be used, and what tools and weapons can be employed during wartime. Without these laws, states would not be prohibited from using their arsenals or developing strategies that would cause massive devastation in their adversary’s territory. Additionally, the reciprocal nature of war means that all of the belligerent parties could expect their territory to be devastated by the enemy. This “eye-for-an-eye” dynamic, if allowed to develop, would lead to sustained and harrowing losses for all parties to a conflict. Because states will use the maximum extent of their force to achieve their wartime goals, the laws of war aim to limit the scope, goals, and justifications for war. Laws governing when and how a state can declare war restrict the reasons a state may invoke, and thus lessen the ‘availability’ of destructive conflict as a policy tool. The hope is that, by limiting the scope and scale of wars and thus the intensity of the conflict, diplomatic solutions to the issues being fought over can be found.

The principle of limited war is realized through the limitation of massively destructive weapons systems, as well as through the conduits through which states must go in order to declare war. The United Nations Charter, and the United Nations Security Council, are now the governing and regulating bodies when it comes to the declaration and justification of war. States must receive the sanction of the Security Council prior to declaring war on other states, unless they have already come under attack. Because of this, the number of wars deemed ‘legitimate’ by the international community should decrease. The Security Council adheres to the principles of a ‘just war’: a war should only be fought to redress a wrong which has been suffered, the violence inflicted should be proportional to the wrong suffered, the war should be fought with a reasonable chance of success, the ultimate goal should be to reestablish peace, and war should be waged only as a last resort.[15] According to these specifications, the number of legitimate wars should be small, considering that most ‘wrongs’ could be resolved diplomatically or would not be considered cause for war, and the conduct and violence of those wars would be limited, for they should be proportional to the force necessary to redress the wrong suffered. Limiting destructive weapons systems assists in limiting wars because it reduces the amount of destruction, collateral and intentional, that can be inflicted upon the enemy. The infliction of great destruction and damage would prompt an equal response from an adversary, and would likely escalate the conduct and scope of the war into a ‘total war’. Thus, by limiting destructive munitions and overly lethal weapons systems, the conduct of war remains limited, and the reciprocation between adversaries is therefore also limited.

The protection of human life and civilian property is also an important principle behind the laws of war. The protection of civilians is conducive to the ultimate goal of peace: the destruction of cities and civilian property would make populations more dedicated to fighting until the end, or make them more hostile towards the adversary. That aside, the protection of civilians is an important principle because of the way states view the nature of war. War is supposed to be fought only as a continuation of policy, in order to achieve a political goal, and its conduct is meant only to destroy the adversary’s military capacity. The indiscriminate slaughter of civilians or the destruction of non-military property does not assist in the destruction of military strength, nor does it advance the end goal of achieving a political policy. Rather, states view it simply as an inhumane act. It is because of this that civilian populations are protected under a number of conventions and treaties regarding the laws of war, and that weapons systems which inflict large amounts of collateral damage or do not explicitly target military targets are limited or prohibited under a number of regulations on the conduct of war.

The development of the laws of war has had a long and varied history, beginning in ancient times and continuing to the present day. The massive expansion of international laws, regulations, and organizations in the last century has expanded the scope of the international laws of war and driven the codification of many regulations on the conduct and weapons of war. These developments are likely to continue as ever more sophisticated weapons systems are developed and enter service, and as new forms of combat and different adversaries appear. Additionally, these laws of war will change and bend according to the changing nature and participants of war. Recent developments in the legal architecture of the international community, such as the establishment of the International Criminal Court and the Nuremberg and Tokyo Trials following World War Two, mean that violators of the laws of war will be prosecuted for their actions. A legal precedent has thus been set which will help enforce the laws of war, and will make potential violators more wary of disregarding them.

The laws of war are now enforced by international legal councils and bodies, and are largely regulated by international laws and regulations. States adhere to the accepted rules of conduct for fear of reciprocation by their adversaries and of damage to their international reputation. Leaders fear prosecution under international tribunals should they violate the laws of war. These laws serve the ultimate purpose of realizing a number of general principles accepted by the international community regarding war and its conduct: the realization of peace, the protection of human life and civilian property, the concept of ‘just’ wars, and wars of limited scope and scale. Because of the laws of war, the world has seen a decrease in the lethality and destructiveness of armed conflict, as well as a general decline in the number of wars fought between states.

Works Cited

[1] General Principles & Sources of The Law, Law of War, http://lawofwar.org/principles.htm, Accessed 20 Apr. 2012

[7] Jordan, William Chester. Europe in the High Middle Ages. London: Viking, 2003.

[8] Firdman and Bastin, History of International Law. Moscow: International Relations Press, 1990

[9] Grotius, Hugo. The Law of War and Peace, trans. Francis Kelsey. Carnegie edition, 1925, Prol. sect. 28.

[10] International Committee of the Red Cross, International Law Concerning the Conduct of Hostilities: Collection of Hague Conventions and Some Other International Instruments. Geneva: International Committee of the Red Cross, 1989, p. 69.

[11] Roberts and Guelff, eds, Documents on the Laws of War. Oxford: Oxford University Press, 2000.

During the 2012 American presidential campaign, the Republican frontrunner Mitt Romney accused his opponent, President Barack Obama, of not believing in “American Exceptionalism”. He continued by questioning Obama’s “commitment to the view of America as a unique and unrivaled world power.”[1] While Romney’s accusation was partly a move to pander to the Republican constituency and discredit Obama, it raised interesting questions about the philosophical nature of political exceptionalism and the ramifications of its permeation through American society. A philosophical study of exceptionalism, especially in American society, can reveal the true nature of our political and social culture as well as the ethical norms and values we apply when viewing the world.

Exceptionalism describes the perception of a country or society in a certain time period that it is ‘exceptional’ in some way and thus does not need to conform to general rules, norms, or principles. In order to arrive at this perception, however, ethical values, political viewpoints, and philosophical frameworks such as nationalism, cultural imperialism, xenophobia, and ethnocentrism must be present and come into play. The simple act of considering a country or society ‘exceptional’ means that it is being judged and valued above other countries and societies. If all were valued equally or viewed in a relativistic manner (that is, if the true value and worth of a different country or society could only be judged accurately by a member of its own culture and society), then there would be no ‘exceptional’ country. By approaching the concept of social and national worth through a relativistic or egalitarian frame, it would be unreasonable for a person to expect to accurately judge their society as superior to others. Doing so would be impossible: they lack the capacity to accurately judge the worth of other societies, and thus could not weigh them against their own. Rather, they would have to accept that there can only be a system of equal, albeit culturally and institutionally different, societies and countries. The fact that American political leaders and our political culture espouse ‘exceptionalist’ values, however, demonstrates that this is not the case. Thus, the conclusion must be that our perception of other countries and societies against our own is not drawn from a relativistic or egalitarian frame, but is skewed by other philosophical viewpoints. Further, because we uphold our own society as superior to others (hence it is ‘exceptional’), these viewpoints must be ones which support the notion that something about our society is inherently and intrinsically better than others.

Arching over and intertwined with these viewpoints is the concept of nationalism. Nationalism is a political identity that involves a strong identification of an individual with a nation. With this identification comes the development of loyalty and pride from the individual towards his nation, its culture, and its society. The United States has developed a strongly nationalistic culture, which is instilled in American youth and invoked throughout the political environment. Much of this nationalism develops as pride in our country’s civic and legal concepts and norms, as well as in a common language and cultural tradition. At a young age, American youths are taught the Pledge of Allegiance, the stories of the founding of our country, and about our ‘founding fathers’. The stories of ‘throwing off British oppression’, the romanticizing of events in the Revolutionary War such as the Boston Tea Party or Paul Revere’s ride, and the cults of personality built around founders such as Thomas Jefferson, George Washington, and Benjamin Franklin all contribute to the development of pride in our country’s origins. Lessons in how the United States serves as a bastion for the lost, the oppressed, and the helpless (consider the famous quote on the Statue of Liberty, “give me your tired, your poor”), as well as how we were the first country developed along liberal democratic guidelines, with due process of law, political representation, social equality, individualism, and capitalist tendencies, help develop a pride in our country’s civic and legal roots. It is from this nationalistic pride in our country’s civics and society that large portions of our ‘exceptionalist’ outlook stem. The historian Gordon Wood argued this point by saying that, “Our beliefs in liberty, equality, constitutionalism, and the well-being of ordinary people came out of the Revolutionary era. So too did our idea that we Americans are a special people with a special destiny to lead the world toward liberty and democracy.” We are taught that the American ideals of liberty, freedom, and democracy originate uniquely from the United States, and that it is the American duty to lead the world towards embracing these ideals.

It is from this societal embrace of the ideals of democracy and liberty, and our perception of ourselves as a ‘chosen’ people ‘destined’ to spread these ideals, that an underlying ethnocentrism in the American psyche is revealed, and the resulting trend towards cultural imperialism demonstrated. Ethnocentrism describes the judgment of other cultures by the values and standards of one’s own culture. In the United States, we judge the merit and value of other countries and societies based upon our own civic institutions and legal history. In cases where other countries are democratic in nature and have societies rooted in liberal philosophy, we can better associate ourselves with them, and generally form cooperative ties with these nations. They are relatively similar to our own society in how they are structured and in the civic values they possess, and thus we accord these countries relatively high value. However, even then, these societies are unlike our own in that they lack the American ‘nature’ to spread democracy and liberty to other countries. Hence, we do not perceive them as ‘exceptional’, nor do we believe that they are ‘destined’ countries like our own. American nationalism, always floating above our perception of other societies, thus ties into this judgment: even though these other countries are like our own in the values that we take pride in, they are not American, and thus they lack the American ‘spirit’ and the American ‘destiny’ to spread them. Therefore, they are inherently lesser. Of even more philosophical interest are the cases where the society we are judging does not mirror our own in its societal values. In such cases, we immediately discard it as a society with less worth than our own. These are the autocratic and nondemocratic countries, which lack the legal and civic foundations on which we so pride ourselves.
The American response to such cases is to, as Wood previously argued, ‘lead’ these societies into democracy and liberty. Indeed, it is this sort of crusade for democracy that led previous administrations, such as George W. Bush’s, to engage in operations seeking democratic regime change abroad in Iraq and Afghanistan. The entire ‘Bush doctrine’, as it would come to be known, was designed to spread democracy across the world. Such a doctrine, and the worldview from which it springs, is as great an example of cultural imperialism and the desire for cultural hegemony as any.

Cultural imperialism is defined as the cultural dimension of the creation and maintenance of unequal relationships between civilizations favoring the more powerful civilization. For the United States, the exportation of American values such as liberty and democracy represents imperialistic control over what the French philosopher Michel Foucault described as “governmentality”. He described it as the “art of government”: “the ensemble formed by the institutions, procedures, analyses and reflections, the calculations and tactics that allow the exercise of this very specific albeit complex form of power, which has as its target population, as its principal form of knowledge political economy, and as its essential technical means apparatuses of security.” By controlling such factors in a government, Foucault’s argument implies, the United States could gain hegemony over the ‘truth’ in that government, and thus gain power. He described this ‘truth’ as inherent in systems of power, coincident with hegemony, culturally specific, and inseparable from ideology. By controlling the ideological framework of a government, by shifting it into a liberal democracy along the lines on which the United States is framed, the United States could thus shape and therefore control the power of that government and gain hegemony and influence over it. The United States has, aside from the recent Bush-era endeavors to spread democracy, historically embarked on a sort of cultural imperialism. Following the Second World War it spearheaded the development of many of the norms of the ‘free, Western world’ to counter those of the Soviet Union. These norms included adherence to capitalist, market economies, a respect for international law and organization, and general liberal, democratic tendencies.
From these norms developed institutions such as the International Monetary Fund, the World Bank, and the United Nations, which the United States has since used to coerce other, lesser powers into conforming to the capitalist, liberalized, democratic worldview. Aside from such power politics, however, the transformation of other societies’ ‘governmentality’ into our own across the globe remains the goal we seek, because our society tells us we have been ‘destined’ to pursue it. Our willingness to supplant the political norms of other societies with those of our own is demonstrative of our imperialistic societal view, and our capacity to enact this cultural imperialism enables us to do so. Little consideration is given to the standing norms of those societies, and little thought to cultural relativism. Again, our nationalistic tendencies make us perceive these other societies as inferior, and our ‘destiny’ dictates to us that they must be changed. Hence, because of our willingness to change these ‘inferior’ civic societies and our capability to do so, a sort of American cultural hegemony has appeared across the globe in the form of our civic values.

The permeation throughout much of the world of capitalism, liberal civics, and democracy has largely been the result of the United States’ efforts as a superpower over the past half century, and the United States, as the source of these values, can be claimed to hold cultural hegemony over the globe. Cultural hegemony is the manipulation of a society’s culture so that one culture’s values are imposed as the norm, and then perceived as a universally valid ideology and a status quo beneficial to all of society. In terms of global society, liberal democracy, by which is meant political representation and the right to personal freedoms and liberties, is now accepted as a universally valid right. Additionally, these values have become status quo norms in global society: societies and countries which rebuke liberal democracy are viewed as pariahs in the international community, such as Gaddafi’s Libya, North Korea, and Iran. Thus, the United States has pushed its own societal values across the globe so that they have become the international norm, and from this it has gained hegemony over the values to which the international community subscribes. The fact that the United States has been so successful in shaping international society, and that it is uniquely American values which have become the norm, lends support to the perception that the United States is therefore uniquely ‘exceptional’ and above all other states, including those which also adhere to our societal values.

Challenges to the American hegemony over international societal values do arise, however, and in the present day the United States appears to be gradually slipping from its position as the unrivaled global power able to exert its force to enact its ‘destiny’. These challenges reveal an underlying xenophobia within the United States: a fear that other societies will be able to enact and enforce their societal norms upon the international community. Because we perceive our society to be ‘exceptional’, and therefore better than all the rest, we are concerned with the prospect that a society of lesser ‘worth’ will shape the norms of the future. Nationalistic pride in American civics, as well as the belief that it is our ‘destiny’ to spread and convert the world to our style of liberal democracy, means that we are determined not to allow foreign societies to do what we have set out to do. That destiny, in our minds, is not theirs to have. Perhaps this can explain why Mitt Romney accused President Obama of not believing in American exceptionalism: he was assaulting Obama’s commitment to the American ‘destiny’ while making subtle nods to this underlying xenophobia in American society.

American exceptionalism is thus a powerful and motivating force in the American psyche, as is the worldview it espouses (one in which the United States is a unique, unrivaled state with a morally superior purpose to spread liberal democratic values). However, this exceptionalism is the product of underlying worldviews and philosophical frameworks operating within the United States: nationalism, ethnocentrism, cultural imperialism, a desire for an American-dominated international cultural hegemony, and an underlying xenophobia. While, from the perspective of an American who adheres to American civic values and societal norms, the American ‘destiny’ to spread democracy and liberty is a good one, the underlying philosophical frameworks which give rise to this sense of destiny, and the perceptions of other societies that follow from it, lead to ethical quandaries.