Allies

The Second World War brought America to the forefront of the world stage, whether it wanted to be there or not. The United States entered the war with plans for victory and plans for the world after the war. During