Healthcare is one of the most important professions in the United States, and starting a career on the right foot means attending a school that will prepare you well. Here is a list of the best health schools in America.