Top Definition
Women who have furthered not only the rights of women but the rights of other groups as well. Feminists have left a profound mark on society. Early first-wave feminists were often aligned with the abolitionist movement, fighting to rid America of slavery. Feminists have also been involved in the civil rights movement, the sexual revolution (see below), the fight for the rights of the LGBT community, and more.

Feminism encourages women to take control of their own sexuality rather than fear being branded a "whore" or a "slut" because of the double standard...and I don't see why anyone is complaining about that. Confident women who are comfortable with being sexy should be right up your alley.
Where would we be without feminists?
by Fenderrrilla July 27, 2005
