During World War I, a shortage of labor meant that women began taking on jobs traditionally done by men, working in factories, offices, and transportation. As women took on these new roles, their position in society changed: they gained greater freedom and independence and increasingly demanded equal rights. After the war, women in many countries gained the right to vote, in recognition of their contributions to the war effort, which included working in hospitals to care for wounded soldiers.