
The belief that women are equal to men: that men should treat women as equals and respect a woman as they would respect another man. Feminists do not dislike or hate men; they simply believe that women should be treated with respect, paid the same as men in the workforce, and share domestic duties equally with men.
Feminism recognizes women as being equal to men.
by jispeks February 1, 2019