Before World War I, women were not allowed to vote, serve in the army, or hold many specific jobs. In fact, this work was considered men's work. This all changed when America entered World War I, because a majority of the men in the United States went off to war. Women all across the United States picked up the jobs that the men left behind. This was the first time women got a chance to do "men's" work. They were then allowed to hold jobs as police women, firefighters, bank clerks, and many more. New jobs were even created because of the war; women began working in ammunition factories. Women took on these jobs, but they were paid less than the men had been.
Because women were paid less, companies could continue to hire women and save money. Even so, once the war ended, employers went back to hiring the men to do the jobs that they had left behind.
Secondly, after World War I ended, the Nineteenth Amendment was passed. This amendment was ratified on August 18, 1920, and guaranteed that United States citizens could not be denied the vote on account of their sex. Women would now have the right to vote. This was the start of the women's rights movement. After the amendment was passed, the United States had to readjust to women being able to do new