
Feminist

Feminists are people who believe in equality for women. It’s a role for any gender. If people think it’s cancer, they lack medical knowledge.
Leo and Lisa are feminists because they believe that women should be paid the same wages as men.
by Urban_Dictionary.User December 10, 2020
