The Truth About Feminism


Feminism is the advocacy of social, political, and economic equality of the sexes and the promotion of gender equality in society. Feminism is not misandry, which is hatred toward men or the desire for women to hold power over men. These two terms, I believe, have been widely confused by both men and women. Society sees "feminism" as a dirty word created by angry women, with no real value or purpose. In reality, feminism is an equally important movement for men and women. It not only helps create more opportunity and equality for women, but it also recognizes the importance of addressing men's issues in society alongside women's.

Feminism aims to improve equality between men and women in the workplace, in social life, and in politics. It aims to empower women, to prove that we are capable of accomplishing anything a man can and that men and women deserve the same respect, while also improving the lives of men rather than condemning them for the actions and beliefs of misogynists.

Feminism aims to destroy gender stereotypes and roles, allowing women to enjoy sports without being seen as less knowledgeable and allowing men to enjoy knitting without being seen as less of a man. Feminists want men and women to choose their own roles in relationships as well as in life: a man should be able to be a stay-at-home parent without being looked down on for not being the breadwinner, and a woman should be able to hold a high position in her career without being condemned for not staying at home with her children, or for not having children at all. Its goal is to end the stereotype that women must stay in the kitchen cooking and cleaning while men are out making money, and to give every man and woman the chance to do whatever they please, within relationships and in the world, without criticism from society.

Destroying the gender roles that have been ingrained in society would create more career, social, and political opportunities for both men and women. If women were no longer expected to stay at home or seen as sexual objects, the jobs aimed specifically at attractive women (cocktail waitressing, sexualized roles in TV and film, etc.) would open up to men, and more women would be willing to venture into the working world if opportunities were based on skill rather than gender or attractiveness. The wage gap (which does exist and is especially obvious in Hollywood, despite the doubt created by its numerous and sometimes difficult-to-understand factors) could be closed, allowing women to work the same jobs for the same pay without taking anything away from men.

Politics could open up so that women are better represented by other women who understand their experiences and can help make informed decisions on women's issues. In the social world, men would no longer have to be the ones paying at bars and clubs while women are let in for free, and dating would become more of an equal effort between men and women.

Feminism aims to destroy the idea that men must be aggressive, strong, and controlling in order to be "real" men, and the idea that women must be nurturing, sweet, and complacent in order to be "real" women. Feminism wants to break down society's expectation that men be masculine and tough all the time. It wants men to be able to speak out about domestic violence or sexual assault by a woman without being seen as weak or brushed aside as if it doesn't happen. It wants to acknowledge that, although it happens less often, women can and do rape and abuse men.

Feminism wants society to stop portraying men as animals with no control over their sexual urges, and to stop assuming that women always want sexual attention simply because of the way they dress. It is trying to destroy the idea that sexual assault and harassment toward women are simply a result of "boys being boys," because good men deserve more credit than being generalized as animalistic, sexually out-of-control beings. It wants neither men nor women to become victims of blame and scrutiny for speaking about the sexual attacks that have happened to them.

Feminism wants all people, whether men or women, to be listened to and understood when they choose to come forward about assaults, so that they can have justice for what they've endured. Feminism doesn't assume that all men are evil and misogynistic; rather, it holds that men can be and are good, and should not be condemned for the actions of male rapists and abusers.

Feminism's goal is to create a society where men are taught that expressing their feelings does not make them less of a man, and where women are taught that they are strong too. It wants a society where men are no longer taught that "crying is for girls," so that they can actually communicate their feelings and problems and have more successful relationships with women throughout their lives. It wants a society where men aren't expected to always be the breadwinners of a family, and where women are able to provide for themselves without expecting a man's support. Feminism wants a society where a woman is seen as an individual before she is seen as a sexual being or a baby maker, and where a man and a woman can be seen equally as parents and treated fairly in parental or custody matters.

Feminism is about creating a world of equality, free of prejudice, judgment, or hatred between the sexes.

More about feminism, music, photography, mental health, and more on my personal blog, Janellisms.
