Military

Scholars and practitioners in the social and behavioral sciences have studied military systems for over a century. Archaeologists and anthropologists have offered insights into the war making of prehistoric humankind as well as modern primary group dynamics (the ways soldiers develop their own methods of making sense of and appraising what they do) and the ways modern societies celebrate and memorialize warriors (Keeley 1996; Divale 1973; Mosse 1990; Ben-Ari 1998). Historians have explored the relationships between the social, economic, and geographic contexts of ancient and modern states and the military institutions they generated as well as the relationship between technological changes in weaponry and changes or the absence of change in the ways states and their military leaders prepare for and wage war (Mumford 1961; Mann 1986; Vagts 1959; Lynn 1984). Political scientists have delved into the relationships between political and military elites and the presence or absence of military coup making (Finer 1988; Huntington 1957; Stepan 1971; Peri 1983; Karsten 1997; Feaver and Kohn 2001). Sociologists and social psychologists have asked how soldiers are recruited, trained, and motivated; how racial and gender integration is achieved; how morale is sustained or lost; what combat does to those who experience it; and how military institutions and personnel interact with the rest of society (Andrzejewski 1954; Janowitz 1960; Moskos et al. 2000; Stouffer 1949; Kindsvatter 2003; Gal and Mangelsdorff 1991; Cronin 1998). Geographers and economists have attempted to measure the costs and benefits of military institutions and warfare on regions, with their attendant impacts on domestic economies (Nef 1950; Melman 1985; Knorr 1956; Russett 1970; Rockoff 2005; Kirby 1992). Cultural studies/literary history scholars have mined the memoirs, poems, and novels of veterans (Wilson 1962; Fussell 1975; Fussell 1989; Lewis 1985).

For millennia organization for warfare has been one of the central activities of humankind. Military institutions, however simple they were in prehistoric times, were among the first social institutions. Bands of hunter-gatherers employed simple weapons and tactics in fights with other hunter-gatherers. The kinship-centered, cooperative propensities of earlier humankind coupled with the effective use of verbal communication offer better explanations of the combat effectiveness of early human communities than do studies relying solely on theories of aggressiveness (Bigelow 1969; Dawson 1986).

Simple, subsistence-level societies did not all wage war in the same manner. Resource availability and cultural differences among those societies were important factors in war making. The decision of some communities to house new couples in the wife’s mother’s community (matrilocality) as opposed to the husband’s father’s community (patrilocality) has been found to be associated strongly with a low level of local conflict as a result of the constant breaking up of extremely localistic war bands and a higher degree of effectiveness in longer-range warfare as a result of the creation of more cosmopolitan intercommunity trust and affiliation (Divale 1973).

Simple settled agricultural communities defended their fields with every-man-a-warrior militias. The first city-states tended to emerge in fertile alluvial valleys, where warlords dominated and fortified central market towns that had a surplus large enough to enable them to retain a professional military retinue. Those warlords centralized the acquisition and distribution of weapons, attacked and held nearby towns and cities, and extended their power into the pastoral hinterland. Virtually the entire budget of the first known warlord, Sargon of Akkad, the conqueror of Sumer, went to his army, but his military “pacification” also produced secure trade routes, law courts, uniform weights and measures, and a common coinage. (In later periods up to 70 percent of the budget would go to the military, as in the cases of the Roman Empire, Charlemagne, Edward III, and Louis XIV.) In the process those warlords developed symbiotic relations with agrarian and mercantile elites. “Civilization” had arrived (O’Connell 1995; Mumford 1961).

In the Middle Ages the armies of city-states and monarchies served elites who in several areas ruled relatively decentralized feudal communities dominated by aristocratic lords with their own armed retinues. Eventually most of the nominal kingly leaders of those feudal states, utilizing new military technologies and “standing armies” paid for in innovative ways, defeated their aristocratic rivals and claimed a monopoly on military violence. Many of those monarchic rulers later yielded to revolutionary forces employing conscript armies. Most of those military forces reverted in time to the modern model of “all-volunteer” forces, and the more affluent of those states developed increasingly sophisticated weapons, strategic planners, and logistic systems (Redlich 1964–1965; Black 1998; Tilly 1992; McNeill 1982).

The early modern state emerged in areas where monarchs were able to overcome the medieval constitutional traditions of sharing their power with a parliament composed of the gentry and the aristocracy. As commerce, riches from the New World, and more efficient European farming techniques generated economic surpluses, those resources were taxed for military purposes. Intendants loyal to the French monarchy slowly bled power from the nobility to fuel royal ambitions throughout the seventeenth century. King Gustavus Adolphus successfully conscripted Swedes for the seven armies he threw against the Hapsburgs in the second quarter of that century, paying for the effort with the sale of war bonds and monopolies, the appropriation of farms, the rationing of food, and the debasement of the currency, all accomplished by a ruthless bureaucracy. As Charles Tilly put it, “War made the State, and the State made war” (Tilly 1992, p. 213).

In the Netherlands sixteenth-century Calvinist wool manufacturers and merchants organized the first modern professional army. The forces of their Spanish foe had been raised in the venture-capitalism fashion of most early modern forces: The crown paid a fixed sum to professional military entrepreneurs to raise regiments. However, the primary remuneration for those men in the course of the campaign was understood to be booty under the maxim bellum se ipsum alet (“war should feed itself”). The Dutch force was conceived differently. Its mission was defensive and of indefinite tenure. Its commanders tried to avoid the chaotic behavior characteristic of looting soldiers in order to maintain discipline. Hence its men were paid regular salaries and enjoyed the benefits of a fledgling commissariat. Its employers included some of the world’s first assembly-line (woolen clothing) manufacturers. Thus it is not surprising that Dutch infantrymen were trained to present the enemy with a continuous and lethal series of musket volleys through the use of training manuals that offered a recruit dozens of by-the-number engravings of the steps that all the ranks of musketeers were to take simultaneously in a load-and-fire countermarch (Feld 1975).

Some technological innovations transformed both military methods and social and political structures. Bronze weapons were expensive. Thus Bronze Age armies were aristocratic, and their states were oligarchies. The advent of cheaper iron weapons meant that men of more modest means could bear arms. In ancient Greece this eventually resulted in more democratic polities. The stirrup enabled armored men to fight more effectively from horseback, but armor and large horses were expensive. Only an oligarchy could afford to field that type of force in Europe, Asia, or Africa.

By 1350, however, pikemen and crossbowmen had dealt the armored cavalry of feudalism devastating blows, and although the landed nobles resisted, their role as cavalry in military systems began to decline. The introduction of firearms into western Africa and Maori New Zealand significantly transformed the social and political structures of those peoples. In Europe and the Middle East firearms grew in significance as the rate and rainproof reliability of fire increased tenfold between the early sixteenth century and the late seventeenth century. By 1600 the ratio of infantry to cavalry in Europe had risen to almost 8 to 1. Military demands continued to influence and be aided by developments in the clothing industry, the metals trades, nautical technology, land transportation, and high finance (Vasillopulos 1995; Nef 1950; Van Creveld 1983).

The evolution and growth of military institutions appears to some to have followed a steady linear path from the simple to the sophisticated, but there were many exceptions. Indeed the European feudal and nineteenth-century Chinese warlord military systems were retrogressions from the more complex and effective armies of the Roman and Chinese imperial states that preceded them. Technological advances in warfare were not adopted readily by many military elites (Goldman 2006). Eighteenth-century and early nineteenth-century Mameluke warriors in Egypt clung to swordsmanship; sixteenth-century Japanese shoguns embraced firearms with great success but then abandoned and suppressed their use out of respect for the ethos of the elite class of samurai swordsmen; and medieval European aristocrats disdained improvements in infantry weapons and tactics for similar reasons (Stone 2004).

Armies grew in size as well as complexity over the course of several centuries, but those increases were not driven by technological breakthroughs. They came about only when political leaderships decided that such increases were appropriate. Empire builders such as Louis XIV and Phillip II increased the size of their armies, whereas leaders in Poland, Britain, and the United States held back. The leadership of the fledgling state of Israel produced a military with a high participation ratio because of its sense that Israel was beleaguered. Revolutionaries such as those in the French Committee of Public Safety and the Chinese Kuomintang and the Chinese Communists opted for mass armies for political purposes. Political leaders decide to add weapons and manpower at times of opportunity or threat; they also decide to reduce their expenditures and forces when that seems to be the right course. Examples include the Swedish decision in the eighteenth century to reduce the national military and the decision by the Chinese Empire in the sixteenth century to withdraw its massive navy from the Indian Ocean (Lynn 1990; Stone 2004; Vagts 1959; Perrin 1979).

Conscription of Frenchmen in the 1790s for the revolutionary infantry advanced the role of the common soldier. The conscription act called on those with new rights to satisfy new obligations. However, that massive and recurring mobilization did not lead to greater democracy. The musket had not “made the democrat,” in J. F. C. Fuller’s formulation (Fuller 1961, p. 33), in revolutionary France any more than it had in early modern Japan, Russia, or Prussia. Although it ended the battlefield supremacy of the samurai and the knight, they reemerged as the officer corps of the new standing armies.

Although the intensity of warfare and the military participation rates of male citizens both increased throughout the nineteenth and twentieth centuries, the share of both the gross national product and the resources of the state devoted to military expenditures began to decline in the late twentieth century. Social welfare and other nonmilitary lobbies grew more effective at the expense of the military-industrial complex. Civilian experts and technicians provided an increasing number of services to military institutions as the “tooth to tail” ratio of support personnel to combatants grew. Those in technical military specialties became increasingly vital to the maintenance and functioning of increasingly sophisticated military equipment (Wool 1968). Private contractors began to replace some military personnel. The ratio of American contract personnel to military personnel in the Gulf War was 1 to 60; by late 2006 in the war in Iraq it was virtually 1 to 1 (Hemingway 2006).

Military personnel have been recruited as volunteers or conscripted by the state. At different times and in different places volunteers have had a variety of reasons for offering themselves for service, and nations have employed a number of philosophical and technical approaches to the recruitment process, ranging from a total reliance on volunteerism to the most brutal sort of compulsion, with a host of intermediate formulas (Levi 1996).

In the absence of conscription, individuals have chosen to serve for monetary rewards or economic security, adventure or glory, and religious or political idealism. The soldiers of ancient Greece and Rome, those of medieval and early modern magnates, and those of more modern armies of empire were motivated largely by economic considerations; in fact some were foreign mercenaries. However, those economic motives could be intermingled with more culture-driven ones. Many Irish, Sikhs, and Gurkhas in the service of the nineteenth-century and early twentieth-century British armed forces, for example, conceived of themselves as people with a warrior tradition, a self-image that was not lost on British recruiters (Karsten 1983; Enloe 1980). Similarly the Crow, Pawnee, and Shoshoni braves who volunteered to serve as scouts for the U.S. Army in the 1870s felt both the push of tribal need and the pull of a warrior tradition (Dunlay 1982). Some members of the untouchable caste (harijan) were recruited for British military service in India after the mutiny of elite Indian troops in the Bombay army in 1857. Untouchables saw military service as a vehicle for social mobility, and the thought of being used against Brahmins might have been appealing to some (Cohen 1990). Black Americans first volunteered largely for ideological reasons during the Civil War. Later many found military service to be a clear avenue for economic and social mobility, though they faced disappointment at the hands of racist recruiters and commanders until the second half of the twentieth century.

Within socioethnic communities that do not see themselves as warlike and in subcultures and families that are not impoverished, the individual act of volunteering in peacetime cannot be explained as easily. The first surge of wartime patriotic fervor in modern nation-states has led millions to enlist, but patriotic behavior also has been inspired by other motives. Many colonial New England recruits during the French and Indian War were younger sons who had not inherited land. Their response to offers of enlistment bounties consequently was informed by their desire to acquire a nest egg and personal independence from parental control. Most of those who served in the Continental army were more interested in the size of the bounty offered than in “the Cause” (Anderson 1984; Lender 1986). Conversely, many Confederate volunteers who rode with the guerrilla commander William Quantrill in western Missouri and Kansas during the Civil War were the eldest sons of substantial slave owners, defending their world against what they perceived to be a serious threat to its survival (Bowen 1977).

In any event patriotism alone does not explain why many have selected the military calling in peacetime in a host of historical periods. The spirit of adventure and the martial spirit notwithstanding, economic security clearly has been the primary motive for peacetime enlistments in voluntary military institutions (with the exception of officer candidate academies) (Karsten 1982).

When the question of recruitment is approached from the perspective of the recruiter, there are clear correlations between policy and sociopolitical structure. Mercantilist reasoning led several early modern European states to seek foreign paid volunteers (mercenaries) to keep their subjects employed productively on their farms and at their trades. Machiavelli argued for a militia drawn from both the propertied classes and the masses to defend the liberty of a city-state, but that reasoning did not impress most seventeenth- and eighteenth-century rulers and their bureaucracies. Thus in 1776 Adam Smith maintained in his Wealth of Nations that militias were inefficient. Such a system drew people from their fields and trades to train and failed to bring them up to the standards of professional soldiers (Smith 1937, pp. 653–668). The solution was the creation of a professional military.

The modern nation-state rediscovered the power of local and regional loyalties in recruiting volunteers. Great Britain reorganized its regiments in 1873 by basing one of its two battalions permanently in locales with which they thereafter would be identified. The usefulness of that step for both recruitment and morale was proved quickly, and in the first two years of World War I the massive British volunteer army was raised largely through the private actions of committees and individuals drawing on “local pride,” the “taproot of English nationality” (Simkins 1988, pp. 82, 97, 186). The National Guard Association of the United States, created in the 1870s, lobbied for volunteer units of the various states seeking resources from Congress. The regular U.S. Army, recognizing the recruiting and political power of the guard’s local roots, drew on the same source in the local basing of its army reserve units in the twentieth century. Early twentieth-century Japanese military planners used the strong social bonds of village life to reinforce motivation in organizing army reserve units (Smethurst 1974).

Volunteerism was not always sufficient for raising military forces. Consequently Britain was subjected to a conscription of sorts during the Seven Years’ War and the wars of the French Revolution and Empire. However, the English Militia Act of 1757 and its later English and Irish counterparts of the 1790s, like the American Union army drafts of 1863 and 1864, were designed essentially to spur enlistment by coaxing either service or the purchase of insurance to provide the required “commutation fee” or pay for a substitute. The conscription policies of other eighteenth- and nineteenth-century nations offered fewer options. The peasants of Russia, Hesse-Darmstadt, Prussia, and some Latin American dictatorships were subjected to long terms of service.

Black Americans conscripted for segregated service in World Wars I and II faced both the fear and anger of southern whites and the distrust of white officers who regarded blacks as irresponsible and panicky. However, on the basis of their reading of massive surveys of the opinions of World War II soldiers, social psychologists advising the American military in the 1940s recommended the integration of black and white units to boost morale and improve performance levels. They were supported by white officer combat veterans who had developed respect for their men and had become confident in their ability, a phenomenon reminiscent of the experience of many white officers and their black troops in the Civil War. The integration of the services during the Korean War proved successful (Dalfiume 1969; Mandelbaum 1952). In the early 1960s the John F. Kennedy administration took the next step, requiring the desegregation of housing for military families near bases throughout the South.

On rare occasions women have been used as combatants, as in nineteenth-century Benin and early modern Japan, and as guerrillas by the Soviet Union in World War II. More often they have served as auxiliaries in support, clerical, and nursing roles. In the late twentieth century and the early twenty-first century the armed services in the United States have recruited women more aggressively for a wider range of tasks, including combat support. Simultaneously women have been admitted to the nation’s federal and state service academies. There was considerable resistance to this change, but academy leadership later began to crack down on sexual harassment and selective hazing (Alpern 1998; Sherrow 1996; Gelfand 2006).

The process of socializing military inductees into the service’s norms and mores while preparing them to perform their new duties always has had two dimensions: the goals and practices of the military and the impact of the process on the inductees. Certain features of the first dimension have been persistent and unmistakable. Discipline, collective action, the transmission of unit traditions, physical conditioning, and the acquisition of specific military skills always have been objectives of those responsible for the integration of recruits into military forces.

Modern boot camps are assumed by some social psychologists to be sophisticated versions of this process of reorienting individuals into the regimen and mores of the warrior culture with its male bonding. However, a study of U.S. Marine Corps basic training at Parris Island, South Carolina, in the 1940s and 1950s established the fact that marine officers for generations had felt it best to leave the process entirely in the hands of drill instructor sergeants (DIs), who trained the next generation of DIs without formal manuals or officer-led instruction. As one officer put it: “Probably it’s a good thing we don’t know how it’s done. If we knew, we might fiddlebitch and tinker with the process until we ruined it” (Fleming 1990, pp. 24–25; see also pp. 140–155).

The military always has reinforced training with disciplinary codes and leadership methods to ensure that missions are accomplished. Those codes and methods sometimes change, reflecting changes in the value system of the larger society or new demands within the military. The patterns of organizational authority in the modern military have changed since World War II. As the military became more technologically sophisticated, employing more specialists, the need to reenlist those specialists grew, but the specialists were often averse to arbitrary authority. Many former specialists indicated in the 1950s and 1960s that they had left the military because of its coercive ways (Wool 1968). Simultaneously soldier resistance movements, some developing into military unions, grew in the technologically advanced Western states in the 1960s and 1970s (Cortright and Watts 1991). Hence out of need, military elites slowly devised and provided less coercive forms of leadership than had prevailed before. The movement from coercion to persuasion accelerated in the United States when the draft was abolished in 1973 and the services had to rely entirely on volunteers.

From the time when the first group of hunters drew on their supportive habits to collaborate in a successful warband raid on their neighbors or in the defense of their village, the small-group camaraderie in military units has influenced the effectiveness of skirmishes, naval engagements, and pitched battles. Anything that disrupted that camaraderie was suspected of damaging military effectiveness. French revolutionary leaders in the Committee of Public Safety knew how to organize small squads of about fifteen men. When those men received their portions of stew in the evening, they often were provided with revolutionary broadsides or songs that they were expected to learn by the evening campfire. French revolutionaries understood the importance of patriotic fervor as well as what modern sociologists call the primary group (Lynn 1990). That induced bonding generally was successful. “A new comradeship and unity blossomed in our young lives,” Emlyn Davies recalled of his early days in the Seventeenth Royal Welsh Fusiliers in 1914 (Simkins 1988, p. 302). The Canadian major George Pearkes wrote home in 1917 that “it always seems to me that I’m not fighting for King and Country but just for [my] company, which seems to be everything to me these days” (Pearkes).

In the 1950s U.S. Army researchers concluded that in battles between German and American units in World War II the German units generally appeared to have bested comparable American groups. In 1983 Martin van Creveld argued that this was the case in part because American policies with regard to unit formation and casualty replacement practices resulted in a fighting force with lower small group cohesion and trust than German units, in which cohesion was the conscious objective of commanders. Other research offers an additional explanation for German morale: the strength and depth of Nazi ideology and indoctrination (Van Creveld 1983; Bartov 1991).

Many Americans entered Vietnam with confidence in the rightness of their cause and the effectiveness of their weapons and leaders. That confidence often was reduced after months of heavy combat in steaming-hot terrain to what one veteran called “a war waged for survival in which each soldier fought for his own life and the lives of the men beside him, not caring who we killed … or how many or in what manner” (Lewis 1985, p. 118). Their plight was made more perilous by the high command’s practice of cycling career officers through brief combat command tours of duty.

As the rate, range, and lethality of fire and the duration of exposure to it rose over five centuries, combatants experienced increasing stress (Keegan 1976). After prolonged periods of combat, the din of battle and the sight of dying friends produced “the shakes” and other symptoms of mental distress in many soldiers. In World War I their reaction was called shell shock; in World War II, battle fatigue. This phenomenon appears to have affected men in the American Civil War as well (Dean 1997).

The increasing lethality of combat might have been expected to lead to greater unwillingness to respond to orders under fire. However, although there is clear evidence of this among French forces in World War I (Smith 1994) and some evidence in other armies in the twentieth century, most troops have obeyed orders that placed them in “the killing zone.” Most mutinies involve matters of pay or living and working conditions (Bushnell 1985; Rose 1982). Many combat veterans who suffered posttraumatic stress disorder (PTSD) long after their years of service owed their distress to the trauma of combat, but not every veteran of heavy combat became a victim of PTSD (Card 1985).

Military service has both temporary and long-term effects. Some who appear to have been transformed by the experience are better understood to have possessed those propensities before entering the service or to have entered the service with traits or personalities that made them especially prone to experience the change. West German recruits who were given an “authoritarianism” questionnaire before entering basic training, again after completing eighteen months of service, and again two years after the completion of their service were found to have undergone a decline in their level of authoritarianism while experiencing it firsthand in the Bundeswehr. However, they then drifted back to the original higher level after they had put that experience behind them (Roghmann and Sodeur 1972). The process of self-selection into American airborne training and Green Beret service as a result of already possessed values proved to be more important than the training or duty assignments afterward in explaining post-training or post-service attitudes and values (Cockerham 1973; Mantell 1975). Thus the impact of training and efforts on transforming attitudes can be overstated. Militarization, if and when it occurs, often has been confused with the reinforcement of established values.

In some modern cases mobility opportunities in subsequent occupations improved as a result of military service, as was the case for minorities in the U.S. military in the 1950s and afterward (Browning, Lopreato, and Poston 1973). One’s perspective on the world could be altered as well. Certain American Revolutionary War soldiers seem to have experienced a change in political perspective. Officers who served outside their own states tended to adopt more cosmopolitan political positions after the Revolution, as did some enlisted men. Others who had not left their state but were similar in age, nativity, religion, social class, and county affiliation (the sum total of these characteristics constitutes “background”) to those who had left their state exhibited no such change. One group had seen more of the Confederation and its plight and had seen the need for stronger bonds in the form of a new Constitution (Benton 1964; Burrows 1974). Similarly French soldiers who had served in America during the war were more actively involved than others in attacks on the homes and records of French nobility during the early stages of their revolution (McDonald 1951). Service in the Prussian/German armies and navies appears to have made militarists of many veterans (Ward and Diehl 1975). In analyzing the interactions of the military and society, future scholars will continue to ask how military service may have affected those who served as well as how some military institutions have affected the societies they belonged to whereas others simply have reflected those societies.

Burrows, Edwin G. 1974. Military Experience and the Origins of Federalism. In Aspects of Early New York Society and Politics, eds. Jacob Judd and Irwin H. Polishook, 83–92. Tarrytown, NY: Sleepy Hollow Restorations.

Karsten, Peter. 1983. Irish Soldiers in the British Army, 1792–1922: Suborned or Subordinate? Journal of Social History 17: 31–64.

Karsten, Peter. 1997. The Coup d’Etat and Control of the Military in Competitive Democracies. In To Sheathe the Sword: Civil-Military Relations in the Quest for Democracy, ed. John Lovell, 149–163. Westport, CT: Greenwood.

Keegan, John. 1976. The Face of Battle. London: J. Cape.

Keeley, Lawrence H. 1996. War before Civilization. New York: Oxford University Press.

Kindsvatter, Peter S. 2003. American Soldiers: Ground Combat in the World Wars, Korea, and Vietnam. Lawrence: University Press of Kansas.


Military

As a sociological category, the term “military” implies an acceptance of organized violence as a legitimate means for realizing social objectives. Military organizations, it follows, are structures for the coordination of activities meant to ensure victory on the battlefield. In modern times these structures have increasingly taken the form of permanent establishments maintained in peacetime for the eventuality of armed conflict and managed by a professional military. Accordingly, the military professional is an officer who pursues a lifetime occupational career of service in the armed forces, where, to qualify as a professional, he must acquire the expertise necessary to help manage the permanent military establishment during periods of peace and to take part in the direction of military operations if war should break out. Career commitment and expertise, the hallmarks of any professional, set the professional military officer apart from those other personnel in the armed services who are merely carrying out a contractual or obligatory tour of duty or for whom officer status primarily represents, as it often did in former times, an honorific pastime into which military skill enters only as a secondary consideration.

Throughout most of history the right to employ violence has been derived from membership in a special community or in one of its status groups. While societies everywhere have always regarded outsiders as legitimate targets for violence, societies whose internal relations are based on physical dominance of one group by another allow for fewer of the fine distinctions between brute force and other bases of social and political power. Thus the Roman legions both served imperial ambitions and became the major domestic political force. Indeed, war lords of all epochs have considered their armies a form of private property and have used them to secure their tax base and to extend its boundaries. Under such conditions, the internal organization of the armed forces closely reflects the distribution of power within the society at large. In general, the more pervasive the prospect of violence against external or internal enemies of the regime, the more similar are the military and the civilian value hierarchies.

The concept of the military as a permanent establishment maintained solely in support of foreign policy objectives presupposes the development of a civil society based on consensus. In such a society, the armed forces are called upon to cope with domestic disorder only in extraordinary circumstances, this task being relegated largely to civilian police forces. However, the incapacity of party governments to resolve vexing internal problems, including an inability to mobilize the “home front” in support of national goals, has on many occasions led the military to do more than provide coercive power for use against external enemies. Their role in this regard has been especially important in those newly emerging nations whose civil institutions and sense of national identity have not yet had sufficient time to develop.

Professionalization of the military, with rank and authority granted on the basis of demonstrated competence rather than status, cannot evolve until the problem of military management has become separated from and subordinated to the more general problem of governing a society. Even so successful an innovator of strategy as Frederick II of Prussia, because he wanted to ensure the personal loyalty of his officers, insisted that they be drawn exclusively from the ranks of the aristocracy. Using this kind of power base inside the officer corps, the postfeudal nobility of many a European country was able to prolong its waning influence. It did so by preserving within the military certain archaic sentiments, ceremonial practices, and ideological beliefs that supported the social superiority of officers, and then proclaiming this superiority valid for the society as a whole (Vagts 1937). Militarism of this sort, because it hindered rather than helped the growth of expertise, was a major impediment to the professionalization called for by advances in military technology.

The possibility that certain strata of the society might use their monopoly of armed force to gain a disproportionate share of the values available within a society helps to explain why the right to bear arms has so frequently been declared one of the inalienable rights of a free citizenry. Such a right, when it becomes the prevailing military doctrine, may stand in the way of military efficiency. In France the governments of the Third Republic, intent on curbing any possible political ambitions of military leaders, insisted on a short-term conscript force, and by this manpower policy they deprived the French army of the opportunity to develop a highly trained force. The doctrine of the “nation in arms” in this French version helped seal that country’s fate when it had to confront better-trained and numerically superior German forces in two world wars.

Under modern conditions, lasting victory in war certainly can no longer—if it ever could—be achieved primarily through the sheer weight of a hastily assembled mass of manpower acting either under the command of their social superiors or of a very small professional cadre. Furthermore, a decided advantage now goes to the belligerent with the industrial and scientific base for developing more powerful weapon systems and with a labor force containing sufficient skilled personnel to maintain, repair, and replenish the products of military technology during hostilities. As research, development, technical maintenance, and the organization of logistic support become more important elements in strategic planning, military managers are forced to pay more attention to the implications of economic, social, and political policy for the state of military preparedness. Hence, the events to which they must be responsive have increasingly more to do with scientific-technical capabilities and sociopolitical forces, and proportionately less to do with direct encounters on a clearly delimited battlefield. The traditional wall of separation between strategy (the explicit domain of the professional military) and policy (the explicit responsibility of civil government) comes to be breached at many points.

This shift in perspective has been reinforced by the growing emphasis on deterrence, rather than counterviolence, as the major strategic goal of the nuclear powers. But similar tendencies are evident in the industrially backward nations, whose military leaders recognize that they must create an indigenous economic and educational base as a major condition for a complex military establishment. Their efforts to achieve this frequently propel them into major roles in the modernization of their countries. In general, where civilian agencies lack expertise or legal and political precedents for containing the military, the military can usurp the highest counsels not only through deliberate infiltration but also through lack of any opposing political force.

The rapidity of technological progress in modern times often forces the abandonment of a whole weapon system before it can be operationally tested in battle. Thus there exist, within the military establishment, installations whose express function is to create and develop new and unorthodox concepts and procedures, including the application of computerized simulation techniques to the solution of strategic problems. In some ways, therefore, the military establishment begins to conform more to the pattern of a laboratory for testing the concepts and “hardware” underlying a new system, and relatively less to the pattern of a striking force whose permanent components are designed primarily to provide a basic framework for expansion in case of need. Here, again, the traditional boundaries between the civilian expert and the military professional tend to become blurred, in particular as the influence of the civilian expert ceases to be confined to the design and development of basic military “hardware” but begins to extend to matters related to its application. Civilian specialists have carried major responsibility for the introduction into the military of new managerial methods and training devices based on scientific evaluation rather than on traditional concepts. Perhaps the most significant development, however, is to be found in the number of civilians, either in the direct employ of the military or under contract, who have come to play major innovating roles that extend to the domain of strategy, which is traditionally a military preserve.

The impact of modern technology notwithstanding, all military organizations continue to operate within a context of considerable uncertainty. The authority structure, work routines, and conceptions of discipline in the armed forces must be geared to the ever-present possibility, no matter how remote, that every member of the organization, whatever his job classification, may be called upon to perform his normal duties under battle conditions. Many specifically military practices explicitly express, or have as a latent point of reference, a concern about the capacity to make an adequate response under stress. The anxiety-reducing function of many routines is especially evident in the persistence of archaic ceremonial practices that have no apparent functional utility. These probably help instill confidence that, in the event of a crisis, officers and men at all levels of the organization will conduct themselves in a predictable manner.

Active warfare, moreover, is a highly seasonal occurrence that alternates with more or less prolonged periods of peace. As in the past, the military man must indulge in a certain amount of romanticism to justify his continuing dedication to the martial arts when no apparent need for them exists. Acclamation of selfless service to one’s country as an ennobling ideal for all, emphasis on the manly virtues, and the sense of corporate eliteness implicit in these ideals have been basic ingredients of military esprit de corps. Thus, the tenacity with which European armies resisted motorizing their horse cavalry, even after its inutility in war had been fully demonstrated, derived not only from an aristocratic tradition, symbolized by the officer and his mount, but probably drew added vigor from a reluctance to countenance the replacement of heroic men by mechanized components.

By the same token, the massive resistance of military traditionalists to the formation of a separate air arm, to the introduction of the aircraft carrier and the atom-powered submarine as strategic naval weapons, to the replacement of manned bombers by missiles, and so forth, contains elements of a defensiveness that seems to be characteristic of the military, although it reflects rigidities and vested interests of a kind likely to develop in any large and complex organization. But military doctrines, in particular, are codifications of experiences gained in the past—experiences that are forever being reanalyzed. Since doctrinal modifications in time of peace cannot immediately be tested against new experiences, the remote advantages of change must inevitably be balanced against the confusion and uncertainty that attend reorientation of any sort. This organizational dilemma has been especially aggravated by acceleration of obsolescence. In this process, the reduction of “lag time” (the interval between the time a system becomes operationally feasible and its full acceptance by officers) becomes as important as the “lead time” between the drawing board and the operational stage. The concern about remaining up-to-date creates a real danger of innovation for its own sake rather than as a rational adaptation to changed circumstances. In turning toward science as a source of new ideas, the military may, under the guise of modernity, be searching for the same kind of procedural solutions that it once embraced because they were traditional. Reliance by the armed services in their internal management on highly rationalized procedures and computerized systems diverts some of the uncertainties inherent in the possibility of military failure into a quest for internal order. Scientific innovation, especially when the assumptions behind its adoption are not constantly tested by experience, can degenerate into an obsession with the latest gadgetry, as divorced from reality as the prescientific forms of ceremonialism. Similarly, techniques as unorthodox, from a military point of view, as political warfare and counterinsurgency do not necessarily encourage objective evaluation of the limitations that political and social forces impose on the value of a strictly military success. To some enthusiasts within the military these techniques often appear merely as more effective alternatives to annihilation.

Another effect of technological change has been to undermine the military profession’s insularity, once the almost inevitable consequence of faraway missions, assignments at isolated posts, or duties on board ship, all of which tended to deprive officers of social contact outside their narrow professional world. Many tasks with which military personnel nowadays must cope as a matter of routine are only indirectly related to combat. Modern technology has so transformed the conditions of wartime service that to maintain a single soldier in combat takes many more men than it did when the martial arts were at a more primitive level. It follows that the most rapidly expanding military job categories are generally those involving scientific, technical, and administrative skills—categories for which there are near equivalents in the civilian economy. As a result, the experience gained during military service acquires transfer value for a subsequent career in civilian life, where these same skills are likewise in demand. In order to retain skilled personnel in military service beyond their obligatory tour, the armed forces must try to offer inducements comparable to those in alternative civilian employment.

Recruitment of officers

The traditional, ascriptive pattern of recruitment, especially the time-honored practices of giving preference to sons of officers in the selection of applicants to officer schools, and of fostering among candidates and junior officers a unique professional culture, was calculated to discourage all those not highly motivated toward an officer career. But higher skill requirements have more recently led to a wider search for talent and have opened new opportunities for social ascent to many ambitious young men of relatively modest origin. Sons of enlisted men, once likely to have been disqualified from officership on purely social grounds, are no longer at this great disadvantage. In France, their proportion among new officers has nearly tripled since World War II (Girardet 1964, pp. 38 ff.). There has been a similar broadening of the officer recruitment base, mostly under the impact of technology, in nearly every country. The proportion of the officer corps recruited from aristocratic and plutocratic elements has dropped off even more abruptly as political purges—especially in the Soviet Union, Germany, and many Latin American countries—have forced the separation or premature retirement of officers too closely identified with discredited political regimes.

Despite this general trend toward more representative recruitment, there are still many sons of officers who follow their fathers in choosing a military career and so to some extent maintain the social continuity of the occupation. For example, as opportunities for advancement were sharply curtailed in the contracted German army of the 1920s, the proportion of new officers who were sons of officers increased considerably. In the United States during the two decades after World War II, the proportion of “second generation” officers entering West Point remained at a nearly constant level of somewhat above 20 per cent (Janowitz 1964, p. 135).

The significance of this occupational continuity is debatable; as in any occupation, the amount of intergenerational mobility depends in part on changes within the entire occupational structure. It may be noted that in 1950 about 20 per cent of practicing lawyers were sons of lawyers, law having the highest amount of intergenerational continuity of any occupation in the United States (Warkov 1965, p. 43). But graduates of the major military schools, such as West Point, Sandhurst, and St. Cyr, have had, as a rule, stronger commitments to a military career and, partly for that reason, have contributed disproportionately to the higher officer ranks and leadership positions. Anticipatory socialization early in their family life, together with the experiences and contacts made in the academy, gives these officers a competitive advantage over others recruited directly from civilian life. Hence a hard core of traditionally military families, where they exist, probably exerts greater influence than is indicated by gross figures on occupational continuity. Even in the United States, where such families have not been especially conspicuous, intergenerational continuity of occupation among top military executives seems to have been greater than among their civilian counterparts in the federal government (Warner et al. 1963, chapter 2).

A significant ambiguity results from the fact that the officer corps is both a profession requiring the acquisition of certain skills and a corporate body through whose rank hierarchy each officer must advance. Many officers who have acquired educational and other professional skills of use to the military are not professional military men. In fact, increases in officer allocations in recent years reflect in large measure the need for men qualified to take responsibility for complex equipment and to perform certain technical and administrative functions. While some old-fashioned armies, in order to provide positions for sons of the privileged classes, have had unusually high allocations of officers, in modern armies the increases have been greatest in branches with the most advanced technology and at levels of responsibility—usually intermediate ones—where experienced men with technical qualifications must be promoted in order to be retained. However, the authority of many officer specialists is severely limited, and in some instances their specific designation precludes advancement into positions of major responsibility. Also, the frequency with which officers in many specialties transfer out of the armed service into civilian employment indicates a primary commitment to a specialty that takes precedence over any commitment to the military.

Hierarchy of command

The implications of the diversification of skills and specialties extend beyond the character of officership as a profession. Diversification also affects the internal authority structure of military organization. Traditionally, military authority has been both hierarchical and collegial. On the one hand, military discipline prescribes unquestioning compliance with orders passed down through an unambiguous line-of-command authority, with only the details of implementation left to the discretion of individual commanding officers. On the other hand, military discipline means more than automatic compliance: it subsumes the imperative, binding on every officer, to inspire one’s subordinates by personal example and to cultivate among all officers a strong respect for professional norms. The presence of specialists injects the element of technical knowledge into these authority relationships.

One source of strain stems from the fact that many unit commanders, even at the lower echelons, lack the technical knowledge necessary to direct all the diverse components for which they shoulder formal responsibility. Nor do they have officers on their staffs with sufficient knowledge. If such knowledge is not available at the level of the unit to which an individual officer is assigned, he has a strong incentive, when difficulties of a purely technical nature arise, to solicit information and advice directly from technical specialists attached to higher staffs. This enables him to resolve many routine difficulties while avoiding formal command channels and without involving his commanding officer in the details of every operational problem. However, commanding officers who tolerate such informal trouble-shooting activities, which clearly deviate from prescribed procedures, run the risk of teaching disrespect for the chain of command. In fact, they may inadvertently be discouraging their officers from keeping them fully informed on all matters under their own command.

Another source of strain is that many functions and policies come under the central administration of a staff from which detailed directives emanate. These directives often leave little leeway to a local commander and may actually usurp some of his traditional prerogatives. Staff officers, by definition, have no command authority in their own name, but only as delegated. Yet relatively junior officers can, and sometimes do, informally exercise a considerable amount of de facto authority, simply by virtue of the esoteric wisdom with which their position on the higher staff endows them. The hypertrophy of this kind of staff authority was reached by the German army in World War I, where general staff officers, in control of their own network of communication, came to issue orders that at times completely countermanded directives from commanding generals whom they formally served only as staff advisers. This development was evidently fostered by the past practice of favoring members of royal houses for command positions, the consequences of which staff intervention was intended to redress. Nevertheless, central direction, even when it accords with the best technical principles, tends toward the creation of a dual system of authority and inevitably generates some anxiety among unit commanders about what authority they actually have. The desire of commanding officers to retain firm personal control even over matters that are centrally directed can be seen in the frequency with which they use whatever discretion they have in implementing a centrally issued directive in such a manner as to subvert its intended purpose.

Strain also arises from the need to coordinate the activities of lower-echelon individuals or units that are components of different hierarchies. When the recognition of a work relationship does not in itself induce spontaneous collaboration, there can be considerable concern over limits of competence and of formal authority. Another version of this problem exists in the staffs of supranational forces, where the separate military hierarchies represented are associated with disparities of national power. Staff cooperation in NATO headquarters is said to have suffered considerably from the capacity of some officers to compensate for any lack of rank or formal authority by making use of the power of the nation they represented. Certainly, U.S. advisors in Vietnam were often able to use their country’s control over certain weapons to gain compliance with their decisions, even when they were clearly outranked by their Vietnamese counterparts. Still a third and somewhat analogous version of this conflict, based on ideological disparity, has occurred between professional military officers and political commissars. Where the latter represent the political regime at the unit level, and are therefore assured of outside political support, they are often in a position to countermand certain orders of their nominal superiors if they wish to do so.

No authority structure can by itself ensure spontaneous cooperation under battle conditions, where confusion is inevitable and improvisation and unorthodox solutions are frequently called for. The makeshift character of front-line living arrangements inevitably gives rise to serious deviations from procedures learned in the training camp. A great deal has in fact been written on the displacement of motives to the immediate group: the “comrades” or “buddies” with whom each soldier shares the experience (Grinker & Spiegel 1945; Stouffer et al. 1949, vol. 2; Mandelbaum 1952; Janowitz 1964, pp. 195–224). This sense of solidarity, which in some ways extends to all combat men, usually engenders strongly deprecatory attitudes toward those echelons of lesser risk from which most regulations emanate. To the degree, therefore, that individual and interpersonal motives become determining factors, military units in combat tend to assume some of the characteristics of a primitive mass formation. The capacity of this formation to absorb stress is highly contingent on the strength of shared sentiments. Where organizational authority does not enjoy legitimacy, strong sentiments of this sort can facilitate the rapid spread of deviant tendencies. However, the prevalence of a sense of generalized obligation lends legitimacy to punitive discipline when it is invoked as a last resort.

The unavoidable presence of physical risk is a major source of disruption in combat units. Detailed investigations of the behavior of ground combat soldiers have convincingly documented the reluctance of many riflemen to discharge their weapons against a visible enemy target: during any single encounter, only a minority were found to have fired, irrespective of the chances the men had to engage the enemy (Marshall 1947; for the sources of the following remarks on reaction to combat stress, see Janowitz 1959, chapter 4; Lang 1965a). Evidently success in such encounters does not depend on the performance of every individual. Indeed, among U.S. interceptor pilots serving in Korea, only a small minority of aces accounted for an overwhelming majority of all enemy planes shot down, and most fliers were not even credited with a single plane. Air superiority was nevertheless maintained.

Containment of deviance

The old-fashioned practice of severely disciplining some men for “cowardice” to deter others from failing in their duty under fire at best promotes token compliance when opportunities for escape are lacking. As a means of instilling the motivation necessary for superior performance, it is hardly effective. Yet, under conditions of modern warfare, much depends on the initiative displayed by individuals operating in small units relatively removed from the influence of formal control. The problem under these conditions is how to contain deviance within certain tolerable limits so that it does not disrupt organizational effectiveness. Even in the normal engagement many men will not measure up to par. Since enemy fire causes casualties, rates of desertion, dereliction from duty, psychoneurotic breakdown, and other forms of deviance invariably begin to rise either after a prolonged stretch of uninterrupted combat or after an engagement in which a unit suffers particularly heavy losses. In these circumstances, any break in efficiency has cumulative implications because it tends to impair the motivations and efficiency of other men.

The major role in the control of deviance is increasingly being assigned to the medical specialist. In acknowledging anxiety in battle to be a natural and normal reaction, military psychiatry in general, but American and British military psychiatry in particular, has gone a long way toward treating its disruptive effects on behavior as primarily a medical and only secondarily a disciplinary problem. Although the literature provides regrettably few studies that permit reliable historical or cross-national comparisons, prevailing psychiatric theories certainly suggest that the reliance on rigid disciplinary controls would produce more lasting mental damage, with chances for ultimate recovery much diminished. In World War I, the number of severely impaired shell shock cases was certainly greater than in World War II, with its more enlightened practices of military medicine. Japanese soldiers in World War II, subject to the most unyielding discipline and supposedly indifferent to their own survival, seemed especially prone to severe attacks of hysteria. The possibility of culturally patterned reactions expressing differences in national character, especially tolerance for anxiety, cannot, of course, be ruled out. Yet the apparent severity of the reactions among Japanese troops may have been provoked by the strong sanctions against the open expression of anxiety in any form.

Organizational correlates of breakdown

The availability of a legitimate medical evacuation channel has important implications for organizational behavior. Thus, some psychiatrists have pointed out that a collective belief among American troops in World War II in an objective “breaking point” beyond which a person could not go on may actually have contributed to an increase in the number of psychiatric breakdown cases who requested evacuation because of a typical symptomatology. Neuropsychiatric breakdown was far less frequent among British troops in North Africa, who, unlike the Americans, had no expectations of being permanently repatriated until the end of the war, but whose combat was interrupted by frequent periods of rest. Similarly, there were no neuropsychiatric casualties among South Korean troops until after their integration with American forces, when these same evacuation channels became available to them. Yet they had previously exhibited many other kinds of ineffectiveness.

The point is that evacuation statistics reflect not only psychiatric malaise but also a complex decision process. For the soldier who has had enough, the use of the evacuation channel with the approval of a psychiatrist offers a legitimate alternative to self-mutilation, letting himself be taken prisoner, temporary desertion, and other forms of escape. Thus, during the rapid retreat by U.S. troops from their advanced positions on the Yalu River during the winter of 1950/1951—a period of evident stress—the neuropsychiatric casualty rate exhibited a marked decline. Soldiers could not rely on evacuation, for all medical facilities were severely overtaxed. Even when ready to give up, they had a strong incentive to remain with their unit simply to avoid being captured or killed. Similarly, desertion, which had been a major problem in Europe with major cities nearby, was practically nonexistent among American troops engaged in island-hopping operations against Japan.

Although comparable data from other nations are not available, it is clear that the American combat soldier in World War II was inclined to take a rather lenient view of temporary desertion, consistent with his generally tolerant attitude toward a man who was suffering from symptoms of fear which he had made a genuine effort to control. One distinguishing characteristic of men who became neuropsychiatric casualties was their tendency, on the average, not to entertain favorable attitudes toward the less legitimate forms of escape provided by unauthorized absence from the battlefield. Conversely, many men guilty of desertion in combat left their units only after they had on one or more previous occasions been refused medical evacuation. There are indeed indications that the two forms of escape are in some respects interchangeable and also that the decision on whether a man is entitled to medical evacuation or should be returned to his unit is not only medical but nearly always involves judgments based on organizational criteria. The effect any disposition may have on the morale of the remaining men can rarely be kept from intruding into such decisions. If the tactical situation permits, a man’s prior record of good performance can earn him evacuation for symptoms that might send another man back to the front. Particularly, officers and noncommissioned officers who carry responsibility for other men, whose safety might be jeopardized by their continued presence, are more readily evacuated (the technical reports on which the foregoing remarks are based can be found summarized in Janowitz 1959; Lang 1965a).

Units in combat are undergoing a constant process of attrition and replenishment, as evidenced by the continuous turnover in personnel. But the maintenance of logistic and organizational support is probably more important for maintaining the effectiveness of larger units than is keeping a particular man in battle, especially if he is suffering from evident symptoms of stress. Viewed in this context, military psychiatry as practiced nowadays reflects the same shift in orientation toward warfare that is often noted in connection with strategic planning: the long-term conservation and management of national resources and talent has become a more important military asset than victory in almost any local encounter. Again the picture of the whole world as a potential battlefield and of the possible involvement of whole populations is reflected in practices that reach into the lower units.

Understanding of combat goals is clearly essential to understanding the military and its organizational practices. Yet the battlefield itself is undergoing change, and the specific missions assigned to the military are changing with it. The new forms of warfare, including ideological war and nuclear deterrence, lead to new priorities in the mobilization of men and resources. Hence, both the relationship between the armed forces and society and the internal structure of the military will undergo further change.

Kurt Lang

[Directly related are the entries Civil-military relations; Internment and custody; Militarism; Military policy; Military psychology; National security; War. Other relevant material may be found in Economics of defense; Intelligence, political and military; Military law; Military power potential; Science, article on Science-government relations; Strategy; and in the biographies of Clausewitz; Douhet; Mahan.]

Johnson, John J. (editor) 1962 The Role of the Military in Underdeveloped Countries. Princeton Univ. Press. → Papers presented at a conference sponsored by the RAND Corporation at Santa Monica, Calif., in August 1959.

Lang, Kurt 1965a Military Organizations. Pages 838–878 in James G. March (editor), Handbook of Organizations. Chicago: Rand McNally.

Millis, Walter; Mansfield, Harvey C.; and Stein, Harold 1958 Arms and the State: Civil–Military Elements in National Policy. New York: Twentieth Century Fund.

Stouffer, Samuel A. et al. 1949 The American Soldier. Studies in Social Psychology in World War II, Vols. 1 and 2. Princeton Univ. Press. → Volume 1: Adjustment During Army Life. Volume 2: Combat and Its Aftermath.

Warner, W. Lloyd et al. 1963 The American Federal Executive: A Study of the Social and Personal Characteristics of the Civilian and Military Leaders of the United States Federal Government. New Haven: Yale Univ. Press.


Military

Europe, 1450 to 1789: Encyclopedia of the Early Modern World
COPYRIGHT 2004 The Gale Group Inc.


Early modern military engineering co-evolved with the siege tactics that characterized European warfare from the late fifteenth to the mid-eighteenth centuries. By 1530 the assimilation of heavy gunpowder weapons was matched by the development of fortifications that could withstand cannonball bombardment. Campaigns usually focused on the taking of a city, although an aggressor's single most potent tactic was often to starve the inhabitants. Early modern siege warfare, precisely because of its relatively static, game-like quality, offered a broad stage for the activities of the engineer. Opportunities abounded for engineers who could maximize the capabilities of machines and gunpowder, effectively organize the immense workforce of trench diggers, ease the enormous burden of siege train baggage on campaign, or design an "impregnable" fortress in peacetime. As military engineers sought to define a science at the core of their new profession, the sphere of military engineering opened up an avenue of advancement both for men and for ideas about how the world of resisting walls and projectiles—matter and motion—worked.

THE NEW WEAPONS

Gunpowder weapons were known to Europe by the 1320s. The earliest "cannons" were usually large barrel- or pot-like receptacles made of forged metal, mounted on a cumbersome cart and charged with irregular balls or projectiles. By 1500 most of the innovations that were to determine the form of muzzle-loaded cannons had been introduced. Cannons were cast of bronze (and, shortly thereafter, iron) to specific lengths and calibers. These ranged from the very smallest falconet, at a barrel length of six feet and a caliber of just over two inches, to long slender culverins, to heavy four-ton cannons. (Mortars and, later, howitzers were also cast.) They were then fitted with pivots (trunnions) placed at standardized distances from the rear of the piece and mounted on specialized carriages. Indeed, the invention of standardized trunnions, with the increased ease of aim and accuracy they allowed, has been credited as the secret behind the terrifying reputation of Charles VIII's artillery when in 1494 the French monarch swept through Italy from the Alpine border to Naples.

Even given the impressive advances of the sixteenth-century cannon over its precursors, cannons still presented numerous difficulties that added to the inherent unpredictability of warfare. Each cannon was unique, owing to inconsistencies in metallurgy, boring, and other factors of its production. Cannons shot differently, depending on the gunpowder and how hot they became. They might crack in battle or, worse, explode prematurely if they were handled improperly. The heaviest bombards required dozens of draft animals to haul them; legions of men, employed to maneuver and plant cannons, attended the artillery train.

Innovations in the design of ordnance that might ameliorate these conditions were usually owed to gun makers. Members of the Alberghetti family, for example, requested numerous patents over the generations in which they headed the foundry at the Venetian arsenal. The single greatest improvement to the cannon was effected by the boring machine invented by Jean Maritz (1680–1743) in the mid-eighteenth century. The cannon barrel was rotated by a machine powered by horses, while a bit was advanced into the front of the piece. Before this time, cannons were each cast in a unique mold with an earthen core to make the hollow. The hollow tube was then smoothed on a vertical reaming machine. The boring machine allowed many cannons to be cast from the same mold, thereby helping to standardize shots among cannons. Moreover, because the bore could more precisely fit the size of the cannonball, it nearly halved the space between the inside wall of the barrel and the cannonball moving through it (windage). This greatly increased accuracy and power.

MILITARY ARCHITECTURE

While a number of gunfounders, or their sons, became military engineers, the profession was much more rooted in the tasks of the Renaissance city architect. Architects had traditionally acted as the designers of fortifications and military machinery. Filippo Brunelleschi (1377–1446) had to take time off from the construction of the Duomo in Florence in order to follow troops at war with the nearby city of Lucca. The architect, engineer, painter, and sculptor Francesco di Giorgio (1439–1501) is credited with the development of one of the most important innovations in defensive architecture, the angled bastion, on which effective defensive fire could be mounted; Michelangelo (1475–1564) further developed its offensive capacity. Among the most active workshops in fortifications design were those of Antonio da San Gallo the Younger (1485–1546) and Michele Sanmicheli (1484–1559).

In the context of the decades-long Italian wars (1494–1559), in which huge armies and their siege trains battered Italy, the style of fortification that would dominate continental European warfare for the next two centuries emerged. Italian architects developed the main features of the trace italienne, a polygonal circuit of walls with spade-like bastions built at each angle, by the early sixteenth century. The tall, crenellated walls of medieval fortifications had offered little resistance to cannon. Lower, thicker walls, reinforced by piling dirt against them (the "scarp," which was sometimes faced with masonry), better deflected and absorbed cannonballs and permitted the use of defensive cannon fire. Bastions provided a platform for cannons that allowed defenders to rake the curtain walls with fire (enfilade) and cover neighboring bastions. By the middle of the century, platforms in the curtain walls ("cavaliers") were added so that defenders could enfilade bastion walls, or fire into the bastion should it be taken by the enemy; a low flat wall outside the surrounding ditch, but fitted with parapets (the "covered way"), enabled defenders to reconnoiter the activities of attackers and served as a staging area from which to conduct sorties.

In the course of the following 150 years, the depth of defensive works was developed enormously. Maurice of Nassau, prince of Orange (1567–1625), under the tutelage of the mathematician Simon Stevin (1548–1620), developed further outworks, particularly the ravelin, a fortified point that offered more angles for defensive fire outside the main walls. Fortification designs increasingly resembled star patterns, with a series of ditches, berms, and angled ravelins radiating from the polygonal perimeter of the city walls. The concern for depth of defensive works continued in the French corps of engineers and was brought to a baroque height by the followers of the great military engineer Sébastien le Prestre de Vauban (1633–1707).

Early modern fortifications systems were meant to act as a machine, each part interacting with another. By the onset of the seventeenth century, especially as the focus of European war was then centered on the struggles in the Netherlands, where broad flat land offered an empty canvas for the geometrical designs of engineers, the fortress was designed to take advantage of every possible angle from which any conceivable weapon could be employed. Built into the construction of a town wall and its outworks were plans for every foreseeable method of approach and point of breach by an enemy. Fortifications were tactics, but tactics that operated through a knowledge of mathematics, construction, and gunnery.

ON CAMPAIGNS

If, ideally, the role of the engineer in fortifications was to build into his design a retort to any plan of attack, the role of the engineer in the field was to alter the methods of attack in an unexpected and more efficacious way. It is for this reason that Vauban's most significant contribution to the warfare of his age was not his fortification design, but his novel system of trenches, dug in a zigzag or parallel way so that assailants could reach within range of rampart walls while remaining under cover, and his use of the ricochet fire of mortars to scatter defenders within their own walls. Techniques for driving forward a sap were in themselves a sort of exercise in earthwork construction: trench diggers moved forward, placing baskets filled with earth or rocks (gabions) before them and building up earthen walls along their sides, so that attacking troops could be moved toward the walls, or mines could be laid at the fortification's base. Ingenuity in this regard was considered so valuable that military men sometimes debated whether the shovel was not a more important instrument than the gun.

Management of guns and gunpowder devices was another of the main concerns of the military engineer. Engineers were usually attached to the artillery corps. Their skills in maneuvering machines that weighed anywhere from four hundred to eight thousand pounds were paramount. At the highest levels, engineers were artillery generals, although this rank was usually achieved by noble commanders trained in the engineers' arts and sciences so that, at the least, they could command their forces and supervise the engineers under them.

THE SCIENCE OF MILITARY ENGINEERS

Military engineering was transformed into a new profession around the relatively new arts of gunpowder warfare, and many of its practitioners insisted that it was a practice founded on science. By the end of the sixteenth century, an extensive literature on the various practical and intellectual demands of artillery warfare had rolled off the presses. Mathematics and measure were central to the new science of military engineering. In part, this was so because of the mathematical practices traditionally used by architects in their surveying, reconnaissance, and design activities. Military engineers and those who served them were among the most prolific producers of mathematical instruments and practical mathematical knowledge in the early modern period.

Ratio and measure, in fact, appeared to govern most of the new technical tasks, from the recipes for gunpowder (saltpeter, sulfur, and charcoal), to the charge of the cannon (from one-half to two-thirds the weight of the ball), to the measure of range, to proportioning of fortifications. The book knowledge at new academies for the training of cadets, such as the Accademia Delia in Padua, centered around mathematics. Mathematicians began to intervene in the sphere of military engineering as teachers of foundational (and elementary) mathematical skills and as inventors of new mechanical and ballistic knowledge.
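The charge rule the text cites can be made concrete. A minimal sketch (the function name and the 24-pounder example are illustrative additions, not from the source):

```python
def charge_range_lb(ball_weight_lb):
    """Powder charge implied by the period rule of thumb:
    one-half to two-thirds the weight of the ball."""
    return ball_weight_lb / 2, ball_weight_lb * 2 / 3

# A 24-pounder (a ball weighing 24 lb) under this rule:
low, high = charge_range_lb(24)
print(f"{low:.1f} to {high:.1f} lb of powder")  # 12.0 to 16.0 lb
```

Such simple proportional rules were exactly the kind of arithmetic a gunner was expected to carry out in his head or on an instrument.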

Nicolò Tartaglia (1500–1557) was the first mathematician to seek to regularize the unpredictable art of gunnery through mathematics. Galileo Galilei (1564–1642), a student and a sometime teacher of military engineers, also tackled questions that originated in gunnery, even if his solutions were universalized and reframed to address phenomena far outside it. Galileo's "geometrical and military compass" was inspired by the "problem of caliber" (by which one could figure out the proper ratios among weight of gunpowder charge, weight of ball, and bore size), but it could carry out a great number of computational tasks. His years-long study of projectile motion and materials strength culminated in the publication of his last work, Discourses on Two New Sciences (1638), and contained his breakthrough formulations of kinematic motion. Ironically, the mathematical study of projectiles had yielded the philosophical marvel of a terrestrial physics compatible with Copernicanism, but, as Galileo recognized, it was not a useful guide to cannon shot, since tables based on his work could not account for air resistance and other technical factors. One of Galileo's disciples, Evangelista Torricelli (1608–1647), did produce tables and instruments for mortar fire. Theoretically derived values are relatively accurate for these short-barreled, upward-shooting artillery pieces.
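Galileo's kinematic result, philosophically momentous but of limited use to gunners, can be stated compactly. Neglecting air resistance, a projectile launched at speed $v_0$ and elevation $\theta$ follows a parabola:

```latex
% Drag-free projectile motion (Two New Sciences, 1638):
% trajectory y(x) and range R for launch speed v_0 at elevation \theta
y(x) = x\tan\theta - \frac{g\,x^{2}}{2\,v_{0}^{2}\cos^{2}\theta},
\qquad
R = \frac{v_{0}^{2}\sin 2\theta}{g}
```

Range is maximized at $\theta = 45^\circ$; real cannon shot falls far short of $R$ precisely because, as Galileo himself recognized, air resistance is anything but negligible at the velocities of musket and cannon balls.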

The problems of air resistance were taken up by Isaac Newton (1642–1727). Using Newton's work, Benjamin Robins (1707–1751) thoroughly investigated musket fire, both theoretically and experimentally. Robins's ballistic pendulum allowed him to demonstrate the dramatic effect of air resistance on the trajectory of a musket bullet and to show that muzzle velocity is the most important parameter of artillery performance. However, although his work was translated into German, with commentary, by Leonhard Euler (1707–1783), and subsequently into French, even engineers who knew Robins's work continued to use range as the significant parameter for another generation.
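Robins's central finding, that drag dominates the flight of small, fast projectiles, can be illustrated numerically. The sketch below compares the drag-free range with a simple quadratic-drag integration; the musket-ball mass, diameter, drag coefficient, and muzzle velocity are assumed, merely illustrative values, not figures from the source:

```python
import math

def vacuum_range(v0, theta_deg, g=9.81):
    """Drag-free range: the parabola of Galilean kinematics."""
    theta = math.radians(theta_deg)
    return v0 ** 2 * math.sin(2 * theta) / g

def drag_range(v0, theta_deg, mass, diameter, cd=0.5, rho=1.225, g=9.81, dt=1e-3):
    """Range with quadratic air drag, via simple Euler time steps."""
    area = math.pi * (diameter / 2) ** 2
    k = 0.5 * rho * cd * area / mass      # drag deceleration per unit v^2
    theta = math.radians(theta_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0:
        v = math.hypot(vx, vy)
        vx -= k * v * vx * dt             # drag opposes the motion
        vy -= (g + k * v * vy) * dt       # gravity plus drag
        x += vx * dt
        y += vy * dt
    return x

# Assumed values: a ~30 g, ~17 mm lead musket ball at ~450 m/s, fired at 45 degrees.
ideal = vacuum_range(450, 45)
real = drag_range(450, 45, mass=0.030, diameter=0.017)
print(f"no drag: {ideal:.0f} m; with drag: {real:.0f} m")
```

Under these assumptions the drag-free model overstates the range several-fold, which is why Galilean tables misled gunners and why Robins's insistence on measured muzzle velocity mattered.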

INSTITUTIONALIZATION AND REFORM

In the eighteenth century, technical schools were established for the development of national corps of military engineers. The French led, with formal engineering schools established by the artillery in 1720. These schools offered both practical and theoretical training, the latter again fashioned around a curriculum of mathematics. Graduates from the engineering schools in France became some of the country's leading scientists and political (or, at least, bureaucratic) leaders.

Meanwhile, European warfare began to move away from ponderous siegecraft. Armies had grown larger and more disciplined, and open battle, including more extensive use of field cannon, increased the mobility of warfare. While lighter field cannons had been experimented with since the sixteenth century, the effectiveness of light cannon in battle was dramatically demonstrated by the success of the Prussian army under Frederick II the Great (ruled 1740–1786). Following Frederick's successes against the Habsburgs, Prince Joseph Wenzel of Liechtenstein (1696–1772) commissioned a mathematics professor and captain in his artillery corps to redesign a system of guns that included cannons with shorter barrels and thinner walls on redesigned carriages. After their humiliating defeat in the Seven Years' War (1756–1763), the French looked to the experience of one of their engineers who had been in Austrian service, Jean Baptiste Vaquette de Gribeauval (1715–1789).

Gribeauval, eventually to become the first inspector-general of the artillery, instituted a number of reforms against the traditions of a system of military organization, artisanal production, and technical training more developed than any other in Europe. In the 1760s Gribeauval advocated technological reforms similar to those adopted in Austria. He also tried to establish the manufacture of gunlocks with interchangeable parts and oversaw a revamping of the technical schools. The curriculum in engineering schools would teach algebraic analysis, Newtonian science, and the descriptive geometry of technical drawing. The values and mathematical emphasis of this education were foundational to the later establishment of the grandes écoles, models of technical education from the start and a source of French leaders to this day.
