When Did Germany Become a Country?


Quick Answer

Germany unified in 1871 as the German Empire, formed under Prussian leadership. While Germany's borders changed over the years, the country remained a single entity except for the division into East Germany and West Germany after World War II.


Full Answer

Before Germany became a single country, its territory was divided among many smaller states, kingdoms, and principalities. However, many of the people living in these areas already identified as Germans, and this shared identity helped the new nation thrive in its early years. As a relative newcomer among the established powers of Europe, including France and England, Germany's rapid rise created tensions and rivalries that ultimately contributed to the outbreak of World War I.
