Women who have furthered not only the rights of women, but the rights of other groups as well. Feminists have left a profound mark on society. Early first-wave feminists were often aligned with the abolitionist movement, fighting to rid America of slavery. Feminists have also been involved in the civil rights movement, the sexual revolution (see below), the fight for the rights of the LGBT community, etc.
Feminism encourages women to take control of their own sexuality, instead of fearing being branded a "whore" or a "slut" due to the double standard...and I don't see why anyone is complaining about that. Confident women comfortable with being sexy should be right up your alley.
by Fenderrrilla July 28, 2005