When Did the United States Become a Country?
Last Updated Apr 15, 2020 3:58:51 AM ET
The United States became a country on July 4, 1776, when the Continental Congress formally adopted the Declaration of Independence. The first foreign country to officially recognize American independence, however, was Morocco, in 1777.
In 1778, France recognized the United States as a sovereign nation when representatives of both countries signed the Treaty of Amity and Commerce and the Treaty of Alliance. The next major power to extend recognition was Denmark, in 1781. Great Britain, despite its surrender to American forces in 1781, did not formally recognize the United States as sovereign until the signing of the Treaty of Paris in 1783.