I Was a Feminist Until I Learned I Couldn’t Sexually Harass My Coworkers
I like to think of myself as a progressive. I read the literature, I try to keep up to date on the latest important social causes, and I try to stay on the right side of history. Because of this, I’ve always considered myself a feminist.
But then one day at work, while talking to a female coworker and making my usual handsy moves on her by the water cooler, I was informed that not only did feminism dictate that it’s not right to sexually harass my female coworkers, but that it was in fact “egregiously wrong.”
Well, this sure called my established worldview into question!
After further research online, I found out that, according to feminism, men are expected to treat women as equals and refrain from any form of objectification, sexualization, or harassment.
Having long believed catcalls were the best form of compliment, I was appalled and decided to make a change. Now, I no longer consider myself a feminist.
The truth is, I enjoy the rush of asserting my dominance over women by harassing them far too much to give it up for ‘feminism.’ If we don’t reverse these harmful effects of social progress, how will I, or millions of other men like me, ever make a woman feel violently uncomfortable in a safe public place again?