Feminism is, or is supposed to mean, equality in all things between men and women. It has, however, come to mean advocacy for the school of thought that women are better than men. That is not feminism. As a feminist, I believe that women and men should be given equal opportunities. Men shouldn't be expected to sacrifice for women simply because they are women, and vice versa. What has damaged feminism today is a crowd of ignorant women who believe that being women entitles them to consider themselves better than men. That's not feminism. That's gender discrimination with the roles reversed.
Feminism doesn't mean 'men are scum'. Stop calling yourself a feminist if that's what you think it is.
by 🖤Saharan🖤 February 15, 2018