Gender Roles in America
The World Health Organization defines gender as "the socially constructed characteristics of women and men, such as norms, roles, and relationships of and between groups of women and men, that vary from one society to another" (World Health Organization, 2019). In the United States, gender has often been treated as a binary of male and female. American society therefore holds particular expectations about the roles, norms, and behaviors of men and women.
Initially, American women were expected to handle only domestic chores: caring for the children, cooking for the family, and keeping the home clean. Marriage was therefore a major milestone for women, as they needed a husband to work and provide for them. Households survived on a single income, with the man (husband or father) responsible for the family's financial well-being while the woman managed the home.
However, this has changed as roles have become more integrated. More families now rely on dual incomes, with both spouses earning a living to support the household. The emergence of daycare allows even mothers of young children to leave them in a safe place and go to work. These changes have greatly benefited women, who can now raise families without depending on men. Single mothers work just as hard in a range of professions to raise their families as they see fit, and men are becoming more involved in sharing the responsibilities of childcare.
After the pandemic, I believe women will be even better represented in the corporate world. This is especially likely given the workplace flexibility that COVID-19 has introduced, which enables women to carry out their corporate duties remotely. Extended maternity leave is no longer as necessary, so women can remain more available to their employers. In this way, more women will land senior roles and, with proper planning, hold them over long periods.
World Health Organization. (2019, June 19). Gender. Retrieved from https://www.who.int/health-topics/gender