Because of the preconception that women should be submissive, society continues to prevent them from being dominant in the workplace. Traditionally, women were expected to be housewives and to care for their children, but over time these views have changed. In today's society, women can be not only mothers but also workers outside the home; single mothers, in particular, do every job both at home and in the workplace. Men have long seen women as inferior, and it was not until about the 20th century that women began to gain even a little independence. This ties into gender roles: many studies have examined how children are raised, and when a girl is little her parents give her a doll to care for, which shows that society places the burden of nurturing on girls. Some people say that we become the people our parents taught us to be. This is true: a woman is taught to care, to nurture, to clean the house, and to be a wife, while men, taught to be superior, remain so in the end. Society is slowly changing, but women are still limited in the world of work.

Throughout history, women have been seen as inferior to men, or really to anyone. In her article "Women's Rights in the 19th Century," Stephanie Muntone states that "a woman was seen as a second class citizen in a republic founded on the principles of freedom and equality." America is held up as the country where everyone is equal, but in practice that equality applied only to rich white people. Women did not even have the right to vote; when they finally won it, they felt somewhat independent. According to Muntone, women could attend a world anti-slavery convention held in London, but convention leaders conceded only …

… would not be the cause of low wages and opportunities. American society prevents women from being dominant in the workplace because women are so widely expected to be submissive. Many people joke that women belong in the kitchen, but to what extent is this really a joke? Women are commonly assumed to be housewives whose "work" is the home kitchen and the care of their children. Housewives themselves say that no matter how arduous the housework is, it does not feel like work.[xix] Housework is not paid work; it is a responsibility placed on women. In today's society these views have changed: women now have more job opportunities, and it is normal for a woman to have a job. Yet even though women work, they are still seen as subservient because of the unequal treatment and pay they receive compared to men.