
A movement or social philosophy that advocates legal, social, and economic equality for women. Contrary to the beliefs of some misguided individuals, it derives no particular pleasure from the genital-related misfortunes of men, being rather preoccupied with such frivolous goals as getting paid the same goddamned money for doing the same goddamned work, not being forced to have babies, and not having to put up with shit from the entertainment, advertising and fashion industries that have a vested interest in making sure women hate their bodies.
"Feminism is a benevolent force that aids in providing the best opportunities for all members of society, and anyone who believes otherwise is a paranoid, reactionary fuckwit."
by Ferdinand_the_Bull June 12, 2008