The Civil War had many positive effects; for example, slavery was abolished, citizenship was granted to all people born in the U.S., and the women's rights movement gained momentum. The nation came together when it witnessed that states could not simply choose to leave the Union, and the idea of the states as one single country became more prominent.
The Civil War allowed the country to band together once again and strengthened the people as a whole. Perhaps the most enduring legacy of the Civil War is the unification of the people and of the states. While many people simplify the Civil War and claim that it was only about whether slavery should be allowed in the U.S., the war was also about whether the states had the right to secede and how far states' rights extended.
The U.S. began as a country that defied leadership when the people felt that their leaders were in the wrong. This is most visible in the way that the people behaved before, during, and after the American Revolution with regard to British rule. The Civil War helped to settle the question of which powers belonged to the states and which belonged to the federal government. Reaching an agreement on this division of power helped to create a more powerful and united country.