
It was called the sunshine cure, and in the early 20th century, before the era of antibiotics, it was the only known effective therapy for tuberculosis. No one knew why it worked, just that TB patients sent to rest in sunny locales were often restored to health. The same “treatment” had been discovered in 1822 for another historic scourge, rickets, a deforming childhood condition caused by an inability to form hardened bone. Rickets had been on the rise in 18th- and 19th-century Europe, coinciding with industrialization and the movement of people from the countryside to polluted cities, when a Warsaw doctor observed that the condition was relatively rare among rural Polish children. He began experimenting with city children and found that he could cure their rickets with exposure to sunshine alone.

By 1824 German scientists had found that cod-liver oil also had excellent antirickets properties, although that treatment did not catch on widely, in part because doctors did not yet understand that a food might contain unseen micronutrients important to health. Nearly a century would pass before scientists made the connection between such dietary cures for rickets and the beneficial effects of sunshine. Early 20th-century researchers showed that irradiated skin, when fed to rats with artificially induced rickets, had the same curative properties as cod-liver oil. The critical common element in the skin and the oil was finally identified in 1922 and dubbed vitamin D. By then the idea of “vital amines,” or vitamins, was a popular new scientific topic, and subsequent research into vitamin D’s functions in the body was very much shaped by its image as one of those essential micronutrients that humans can obtain only from food.