arnold layne — Posted July 31, 2012 (edited)

The addition of a female referee to the National Football League next season got me thinking: have women become the dominant gender? I have no qualms about this, so long as they take responsibility for what they are trying to achieve through feminism. It seems to me that women are offended when they aren't treated as equals, but at the same time they refuse to take on some of the unfavorable gender roles men carry. For example, it's perfectly fine to challenge the norms in pro sports historically dominated by men, yet asking men out on dates or proposing to men is still seen as unusual, or perhaps even frowned upon. Women have become dominant in the sense that they can push the boundaries and use sexism as a driving force for their actions. If they really want equality, they should own up to all of it rather than pick and choose. That's my two cents; don't take it the wrong way.