
What Are Gender Roles in American Culture?


Gender roles are defined differently by different people and usually center on opposing conceptions of femininity and masculinity. They are a set of societal norms dictating what behaviors are generally considered acceptable, appropriate, or desirable for a person based on their actual or perceived sex. Gender roles have changed considerably over time, but there remains a strong influence on how a woman should act and how a man should act. In places such as America today, women and men both hold jobs and can still balance spending time with their families and caring for children, whereas in the past women were housewives and could not hold a job because that was considered the man's responsibility. The traditional roles of men and women are also examined, as are the behaviors of men and women that are considered socially appropriate. Gender roles vary greatly from one culture to the next, from one ethnic group to the next, and from one social class to another. But every culture has them; each has expectations for the way women and men should dress, behave, and look.

Each country is different, as each has its own set of beliefs, traditions, and values that it follows and holds important. Within and across different cultures, there is some consistency in the standards of desirable gender role behavior. Males are expected to be independent, straightforward, and competitive, whereas females are expected to be more emotional, sensitive, and supportive. These beliefs have changed little over the years in the United States and, apparently, around the world as well. However, there is some variation in cultural gender role standards across cultures, not just in the United States.
