How Did World War II Change Women's Lives in the United States?


Quick Answer

World War II transformed women's traditional roles in the United States, drawing large numbers of women into the armed forces and into nontraditional jobs on the home front. Women who joined the armed forces largely served in traditional noncombat roles such as nursing, while those on the home front took on industrial jobs.


Full Answer

During World War II, married women in the workforce outnumbered single women for the first time in United States history. Women were encouraged to work outside the home in jobs that could aid the war effort, and fields such as science, previously dominated by men, opened to them. As a result, women gained financial freedom and greater independence.
