When Did the United States Become a Country?


Quick Answer

The United States became a country on July 4, 1776, when the Continental Congress formally adopted the Declaration of Independence. However, the first foreign recognition of American independence did not come until 1777, when Morocco acknowledged the United States as an independent nation.


Full Answer

In 1778, France recognized the United States as a sovereign nation when representatives of both countries signed the Treaty of Amity and Commerce and the Treaty of Alliance. The next major power to recognize the United States was Denmark, in 1781. Although British forces surrendered to the Americans at Yorktown in 1781, Great Britain did not formally recognize the United States as sovereign until the signing of the Treaty of Paris in 1783.

