What Role Did the U.S. Play in WWI?
The United States played a role in WWI by joining the Allied Powers in 1917. It declared war on Germany on April 6, 1917, and then on Dec. 7, 1917, it officially declared war on Austria-Hungary as well.
World War I began on July 28, 1914, a month after the heir to the Austro-Hungarian throne, Archduke Franz Ferdinand, was assassinated by a Serbian nationalist. After his death, Austria-Hungary declared war on Serbia, prompting Serbia's ally Russia to mobilize. Germany, allied with Austria-Hungary, then declared war on Russia and on Russia's ally France, invading Belgium and Luxembourg on its way to the French border. Due to the German invasion of Belgium, the British declared war on Germany, and Britain, Russia and France became known as the "Allies." Over time, other countries, including China, Japan and Italy, joined the Allies in the fight against Germany and Austria-Hungary.
Through the early years of the war, the United States government vowed to stay neutral. This began to change, however, after a German submarine sank the British passenger liner Lusitania in 1915, killing nearly 1,200 people, including 128 Americans. The attack enraged American citizens, who began pressuring the government to enter the war. The president at the time, Woodrow Wilson, continued to maintain neutrality, but after Germany declared it would resume unrestricted attacks on shipping, including passenger vessels, the United States entered the war in 1917. The fighting ended with the armistice of Nov. 11, 1918, and the war was formally concluded by the Treaty of Versailles in 1919.