
What happened to jobs for women after the end of World War I? Select one:
a. Women surrendered their jobs to returning soldiers.
b. Women sought advanced training to get professional jobs.
c. Women formed labor unions to fight discrimination in the workplace.
d. Women were able to get better paying jobs using skills learned while working during the war.

Kevin Sutter

in History


1 answer


Zach Chandler on November 16, 2018

The best answer is (d): women were able to obtain better-paying jobs using the skills they learned while working during the war. That is what happened to jobs for women after the end of World War I.

