feminist

(NOUN)
A person who believes women should have rights equal to men's.
Anti-feminist: "Women shouldn't have rights! They should stay at home and shut up."
Feminist: "Women deserve rights just as much as anyone else, and no woman should be silenced!"
by bffrman January 26, 2023