West Germany

noun

1.

a former republic in N central Europe, on the North Sea: established in 1949 from the zones of Germany occupied by the British, Americans, and French after the defeat of Nazi Germany; a member of the European Community; reunited with East Germany in 1990. Official name: Federal Republic of Germany. See also Germany.

West Germany: cultural definition

Note: West Germany was established in 1949, after dissension between the United States and the Soviet Union led to the division of Germany into East Germany and West Germany. It was formed out of the states included in the American, French, and British occupation zones.

Note: The Bonn Convention of 1952 essentially granted West Germany national sovereignty; in 1955, West Germany was recognized as an independent country.

Note: It made a swift recovery, called the “economic miracle,” from the devastation of World War II.