The idea that hospitals could be dangerous places emerged in the Middle Ages, when outbreaks of puerperal fever and typhus caused high mortality in infirmaries attached to monasteries [1]. Even before the discovery of bacteria, Holmes and Semmelweis implicated health care workers in the transmission of puerperal fever; Nightingale and Farr emphasized that safe food and water and a clean environment could substantially reduce the rate of deaths caused by infection; and Simpson concluded that “hospitalism” contributed to patient death and illness [1]. In the 20th century, nosocomial infections changed in character and increased in number in response to the use of antibiotic agents and invasive devices and to more aggressive medical therapy. By comparison, prevention efforts have lagged, and their effects are still being measured in ounces rather than in pounds. This is due, in part, to a health care system that has focused primarily on diagnosis and treatment rather than on prevention.