When did American women start being treated equally?
Topic: Right to work laws research articles
May 20, 2019 / By Jetta
Question:
So I got into Mad Men, which takes place in the 1960s, so of course the women in the show are treated as inferior. It doesn't seem too terrible, because they have jobs alongside the men and are allowed to own businesses and so on, but they're still expected to have dinner ready when the men get home, not raise their voice to a man, and have nothing to do with the family finances. It made me curious as to when, exactly, all this died down. In what time period or decade did women stop being expected to serve their husbands and all that other stuff? When did marriage truly become an equal partnership, once and for all? And was it a specific event, like the ratification of an amendment, or did society just gradually become accustomed to women being equals (because of all the women's rights groups and laws)? If so, when did that happen?
Thanks so much!
Best Answers: When did American women start being treated equally?
Francene | 3 days ago
I wouldn't take a television program to be an accurate depiction of reality.
That said, men and women are still not treated equally.
Research shows that school teachers give girls better grades than boys, even when the girls perform worse on standardized tests.
Women benefit from women-only scholarships.
Women are a majority on college campuses, and therefore earn more degrees than men.
Government assistance is available specifically for women entrepreneurs.
As a result of these and other pro-women policies, young women out-earn their male peers.
In the US, women control the largest percentage of personal wealth, and account for 85% of all consumer purchases.
Meanwhile, men account for the vast majority of workplace injuries and fatalities (93%), because men are more likely to work in hazardous occupations.
Despite this, and despite men's shorter life expectancy, the government focuses its attention on women's issues, with multiple federal offices and agencies devoted to women's issues and health.
No comparable federal agencies exist for men's issues or health. In fact, roughly twice as much federal funding is spent on breast cancer research as on prostate cancer research.
👍 214 | 👎 3
I'm 60 and have been pleased to see in my lifetime the advances women have made across the board. This is still hard work in progress, but my Lord, how the sacrifices of women and the men who supported them have made such a difference.
Thirty years ago, your average orchestra was about 90% men, and the lead chairs were almost always male. This came crashing down when one orchestra used a 'hidden application process' for first trombone, because one of the applicants was the son of someone with influence. The winner was a woman, which floored everyone. That created a chink in the armor that had kept women out of the better positions. Now, orchestras are close to 50-50. That's just one example.
I pray they continue their labors into equality, and they will have my full support.
👍 90 | 👎 -6
I'd say by the 1980s women seemed more equal than in past decades. It was a gradual progression of the women's movement. Generations of chauvinistic men had to retire from the workforce before this change could take place.
👍 88 | 👎 -15
I think it was in the 60s and early 70s that equality was reached and we started to see women advantaged over men.
Obviously the right to vote in 1920 was a big step toward equality. The Equal Pay Act and affirmative action came in the early 60s.
👍 86 | 👎 -24
I think it depends on the individual, and on the sexism that some families have and others don't. My guess is that most people dropped their sexist biases in the 80s or 90s, but there are people right here in Gender Studies who are every bit as sexist as those back in the 60s, maybe even more so. In other words, sexism is still alive and well in the USA, just not as much as before.
👍 84 | 👎 -33