The evolution of hospitals in the Western world from charitable guesthouses to centers of scientific excellence has been influenced by a number of social and cultural developments. These influences have included the changing meanings of disease, economics, geographic location, religion and ethnicity, the socioeconomic status of clients, scientific and technological growth, and the perceived needs of populations. [1]

A nursing tradition developed during the early years of Christianity when the benevolent outreach of the church included not only caring for the sick but also feeding the hungry, caring for widows and children, clothing the poor, and offering hospitality to strangers. This religious ethos of charity continued with the rapid growth of monastic orders in the fifth and sixth centuries and extended into the Middle Ages. Monasteries added wards, where to care meant to give comfort and spiritual sustenance. Religious orders of men predominated in medieval nursing, in both Western and Eastern institutions. [2] The Alexian Brothers in Germany and the Low Countries, for example, organized care for victims of the Black Plague in the fourteenth century. Also at this time, cities established institutions for people with contagious diseases such as leprosy.

During the medieval and early Renaissance eras, universities in Italy and later in Germany became centers for the education of medical practitioners. The idea that one could recover from disease also expanded, [3] and by the eighteenth century, medical and surgical treatment had become paramount in the care of the sick, and hospitals had developed into medicalized rather than religious spaces. They also grew in size. Large hospitals, consisting of a thousand beds or more, emerged during the early nineteenth century in France, where Napoleon established them to house soldiers wounded in his many wars. These hospitals became centers for clinical teaching. [4] Then in 1859, Florence Nightingale established her famous nursing school—so influential on future nurses’ training in the United States—at St. Thomas’s Hospital in London.

In the United States, cities established isolation hospitals in the mid-1700s, and almshouses devoted to the sick or infirm came into being in larger towns. However, almshouses were not intended to serve strictly medical cases, since they also provided custodial care to the poor and destitute. Benjamin Franklin was instrumental in the founding of Pennsylvania Hospital in 1751, the nation’s first such institution to treat medical conditions. Physicians also provided the impulse for the establishment of early hospitals as a means of providing medical education and as a source of prestige. [5] For most of the nineteenth century, however, only the socially marginal, poor, or isolated received medical care in institutions in the United States. When middle- or upper-class persons fell ill, their families nursed them at home. [6] Even surgery was routinely performed in patients’ homes. By late in the century, however, as society became increasingly industrialized and mobile and as medical practices grew in sophistication and complexity, the notion that responsible families and caring communities took care of their own became more difficult to apply. The result was a gradual shift toward the professionalization of health care practices that eventually included the development of a full and competitive commercial market for medical services that increasingly took place in hospitals. [7] Nursing played a significant role in the move from home to hospital. As historian Charles Rosenberg wrote in his classic book, The Care of Strangers, the professionalization of nursing was “perhaps the most important single element in reshaping the day-to-day texture of hospital life.” [8]

Privately supported voluntary hospitals, products of Protestant patronage and stewardship for the poor, were managed by lay trustees and funded by public subscriptions, bequests, and philanthropic donations. By contrast, Catholic sisters and brothers were the owners, nurses, and administrators of Catholic institutions, which, without a large donor base, relied primarily on fundraising efforts along with patient fees. Public or tax-supported municipal hospitals accepted charity patients, including the aged, orphaned, sick, or debilitated. Some physicians established proprietary hospitals that supplemented the wealth and income of owners. Owners of not-for-profit voluntary and religious hospitals on the other hand took no share of hospital income. Physicians also developed specialties such as ophthalmology and obstetrics and opened their own institutions for this new kind of practice.[9]

Nonetheless, argues historian Rosemary Stevens, at the beginning of the twentieth century, “the hospital for the sick was becoming ‘more and more a public undertaking.’” [10] A national census of benevolent institutions, which included voluntary, religious, and public or governmental institutions, was published in 1910. Of all the patients admitted for that year, 37 percent of adults were in public institutions. [11] The same census documented public appropriations according to class of institution. Public funds included all those from federal, state, county, or municipal sources. Of 5,408 institutions reporting (hospitals, dispensaries, homes for adults and children, institutions for the blind and the deaf), 1,896 (35 percent) were recipients of public aid from one source or another. Looking only at hospitals, 45.6 percent of them received public appropriations, although they received the largest part of their income from patients who paid part or all of their hospital charges. Still, for all institutions taken together, 31.8 percent of their total income came from public funds. These figures should be interpreted with caution, since hospitals in 1910 did not use the same cost accounting principles that we use today. However, the census data suggested that an awareness of the need for public support of hospital care was increasing. The actual amounts of public appropriations received during 1910, according to geographic region, are shown in Table 1. Regional variations occurred, and there was a predominance of public aid to hospitals in the Northeast.

Other regional variations in hospital development reflected regional economic disparities, particularly in the South and West, where less private capital was available for private philanthropy. This hindered the creation of voluntary hospitals. [12] Religious institutions were often the first ones built in these areas. Between 1865 and 1925, in all regions of the United States, hospitals transformed into expensive, modern institutions of science and technology. They served increasing numbers of paying middle-class patients. In the process, they experienced increased financial pressures and competition.

One of the defining characteristics of hospitals during this period was the way the power of science increasingly affected hospital decisions. By 1925, the American hospital had become an institution whose goals were recovery and cure to be achieved by the efforts of professional personnel and increasing medical technology. Hospitals functioned with the advantages of x-rays, laboratories, and aseptic surgery, making hospital operating rooms, with all their technical equipment and specialized personnel, the safest and most convenient places to perform surgery. [13] As nurses became more important to hospitals, so hospitals became sites for nursing education. In hospital-based nurse training programs, nurses learned under the apprenticeship system, with hospitals utilizing students to provide much of the patient care while graduate nurses went into private duty. During the Great Depression, however, as fewer people could afford private duty nurses, more graduate nurses returned to work in these institutions, although they worked at reduced wages.

In 1932, during the nadir of the Great Depression, a hospital census conducted by the Council on Medical Education and Hospitals revealed a shift of usage from privately owned hospitals to public institutions. There were 6,562 registered hospitals, a decrease from the 6,613 reported by the previous census. The 776 general hospitals run by the government were filled to 77.1 percent of capacity. By contrast, the 3,529 nongovernmental general hospitals were only 55.9 percent filled. Still, between 1909 and 1932, the number of hospital beds increased six times as fast as the general population (Figure 1), leading the Council to assert in 1933 that the country was “over hospitalized.” [14] Meanwhile, patients were turning to a new method of paying for hospital charges as Blue Cross insurance plans became more and more popular and accounted for a greater percentage of hospital financing.

Figure 1: Hospital Capacity and General Population, 1872-1932

Source: “Hospital Service in the United States: Twelfth Annual Presentation of Hospital Data by the Council on Medical Education and Hospitals of the American Medical Association,” JAMA 100, no. 12 (March 25, 1933): 887.

A surge of demand occurred after World War II. Although federal, state, and local governments had given some support to hospitals earlier in the century, the government became increasingly important in the health care system after the war, adding huge amounts of money to hospital enterprises. The Hill-Burton Act of 1946 provided funds for the construction and expansion of community hospitals. The National Institutes of Health expanded in the 1950s and 1960s, stimulating both for-profit and non-profit research. Moreover, Medicare and Medicaid, established in 1965, provided money for the care of the aged and the poor, respectively. [15]

For all this support, however, the costs of hospital care grew even faster. As Rosemary Stevens argues, from its inception, Medicare costs surpassed projections. In 1965, for example, Medicare costs were projected to be $3.1 billion. Five years later, however, they reached $5.8 billion, an increase of 87 percent. Less than 10 percent of the increase could be linked to expanded utilization; 23 percent was attributable to rapid economic inflation; and the remaining two thirds stemmed from “massive expansions in hospital payroll and non-payroll expenses—including ‘profits,’” with a doubling of average patient-day costs between 1966 and 1976. [16]

In the 1950s, 1960s, and 1970s, rising public expectations for nursing and medical attendance as well as the recognition by nurse and physician reformers that some patient-care procedures were unsafe drove a reorganization of nursing care. In the hospitals themselves, intensive care units grew and machines became ever more prevalent. Both of these developments required greater expertise among nurses. Nursing education began the move from 3-year hospital-based diploma programs to 4-year baccalaureate programs in colleges and universities. By 1965, over 90 percent of large hospitals and 31 percent of smaller ones had intensive care units staffed by increasingly expert nurses. [17]

In 1970, the American Hospital Association listed 7,123 hospitals in the United States, up 247 from 1960. During this decade, however, a major shift had occurred in hospital utilization. The number of beds in federal, psychiatric, tuberculosis, and other long-term care facilities had declined, while, aided by government funding, community hospitals increased their bed capacity by 32.7 percent (Table 2). These nonfederal, short-term care institutions, controlled by community leaders and linked to the community’s physicians to meet community needs, represented 82.3 percent of all hospitals, contained over half of all hospital beds, and accounted for 92.1 percent of all admissions.

Community hospitals also offered more comprehensive and complex services such as open heart surgery, radioisotope procedures, social work services, and in-house psychiatric facilities. [18] The growth of these hospitals, along with the advent of new treatments and new technologies, contributed to escalating in-patient hospital costs, leading the federal government to impose wage and price controls on hospitals in 1971. Indeed, the years after 1965 and the passage of Medicare and Medicaid were pivotal for everyone in health care because of increased government regulation. Medicare incorporated a prospective payment system in 1983, with federal programs paying a preset amount for a specific diagnosis in the form of Diagnostic Related Groups, or DRGs. [19] As third party payers gained power and status, DRGs radically changed Medicare reimbursements. They also considerably altered hospital decision making, shifting the focus toward greater efficiency. [20]

The 1980s also witnessed the growth of for-profit hospital networks, resulting in increased vulnerability of smaller not-for-profit institutions. More than 600 community hospitals closed. [21] It was at this time that both for-profit and not-for-profit institutions began forming larger hospital systems—a significant change in the voluntary hospital arena. A system was a corporate entity that owned or operated more than one hospital. System formation accelerated with the advent of DRGs, as single health care facilities sought to affiliate in order to reduce duplicated costs.

Cost containment was the theme of hospitals in the 1990s. The balance of power in these institutions shifted from caregivers to the organized purchasers of care, with Medicare and Medicaid becoming a huge governmental influence in all types of hospitals. In the private sector, insurance companies began to take a more active role in managing hospital costs. Health maintenance organizations, which contracted with a network of providers for discounted prices, increased in importance. The focus of care shifted to outpatient services, ambulatory care centers for acute care, and hospices and nursing homes for the chronically ill. [22] Then in 1997, the Balanced Budget Act decreased Medicare payments by $115 billion over five years, including a projected $17 billion reduction in payments to hospitals. [23]

At the turn of the twenty-first century, rising costs have forced many hospitals to close, including public hospitals that have traditionally served as safety nets for the nation’s poor. Some of the larger not-for-profit corporations have bailed out public facilities through lease arrangements, such as the one between the Daughters of Charity’s Seton Medical Center and the public Brackenridge Hospital in Austin, Texas, that occurred in 1995. [24] These types of arrangements have had their own problems, however, such as the complications that arise when a large secular organization such as Brackenridge tries to join forces with a hospital whose policies are dictated by its religious affiliation.

If the professionalization of nursing has had the important effect on the quality of the hospital experience that Charles Rosenberg has suggested, the changes in the nature of hospitals have had a profound effect on the profession of nursing, since the vast majority of nurses practice in a hospital setting. The future of both the hospital as an institution and nursing as a profession will depend on the decisions we make in the coming years about how health care is provided and to whom.

[14] “Hospital Service in the United States: Twelfth Annual Presentation of Hospital Data by the Council on Medical Education and Hospitals of the American Medical Association,” Journal of the American Medical Association 100, no. 12 (March 25, 1933): 887.

[19] Phil Rheinecker, “Catholic Healthcare Enters a New World,” in Christopher Kauffman, A Commitment to Healthcare: Celebrating 75 Years of the Catholic Health Association of the United States (St. Louis: The Catholic Health Association of the United States, 1990), 44; Mike Brennan, “Hospitals Competed in Changing Times,” Everett Herald, August 15, 1993, n.p.

Barbra Mann Wall is an Associate Professor of Nursing, the Evan C. Thompson Endowed Term Chair of Excellence in Teaching, University of Pennsylvania, School of Nursing and Associate Director of the Barbara Bates Center for the Study of the History of Nursing.