Feminist
Feminists are people who believe in equality for women. Anyone of any gender can be a feminist. If people think feminism is cancer, they lack medical knowledge.
Leo and Lisa are feminists because they believe that women should be paid the same wages as men.
by Urban_Dictionary.User December 10, 2020
