Herbal medicine is the use of plants, or natural plant substances, to promote good health. Millions of Americans use herbs to enhance their health, and Nature’s Way is a market leader in herbal supplements.