Top Definition
A movement or social philosophy that advocates legal, social, and economic equality for women. Contrary to the beliefs of some misguided individuals, it derives no particular pleasure from the genital-related misfortunes of men, being rather preoccupied with such frivolous goals as getting paid the same goddamned money for doing the same goddamned work, not being forced to have babies, and not having to put up with shit from the entertainment, advertising and fashion industries that have a vested interest in making sure women hate their bodies.
"Feminism is a benevolent force that aids in providing the best opportunities for all members of society, and anyone who believes otherwise is a paranoid, reactionary fuckwit."
by Ferdinand_the_Bull June 11, 2008
