
Top Definition
Feminism is the belief system that women and men are equal.
While the stereotype of cutthroat feminists is still burned into the minds of the close-minded (see the other definitions for feminism), it is only a derogatory stereotype used primarily by the ignorant. Feminism is believing that women deserve the same rights, respect, and chances as men.
Quotes demonstrating my definition:

Real definition of Feminism:
"Feminism is the radical notion that women are human beings"
-Cheris Kramarae, author of A Feminist Dictionary, 1996.

Stereotype:
"Feminism encourages women to leave their husbands, kill their children, practice witchcraft, destroy capitalism and become lesbians"
-Rev. Pat Robertson,
1992 Republican Convention
by ClemsonGirl August 22, 2004
