
A political movement that seeks equality for men and women. Feminists do not claim that women are better than men in any way.
Both men and women can be feminists, as no gender is claimed to be superior.
Person 1: 'Feminism is so important! We should show the world that women are better than men. All men are rapists!'
Person 2: 'Actually, that's a very sexist thing to say. Begone, thot, you ain't no feminist.'

Feminists believe that men and women have equal rights, e.g.:
-Both men and women should be able to wear make-up.
-Both men and women should be able to not wear make-up.
-The percentage of rapists is actually very small, but all cases of rape should be taken seriously. Everyone is innocent until proven otherwise.
-Men and women should be paid equally. (In 1963, the Equal Pay Act was signed into law in the USA, but studies show that there is still a gender pay gap that varies between approximately 9.1 percent and 70 percent.)
by RespectfulDicc November 30, 2018