
Top Definition
Feminism is a political, social, and cultural movement that aims to achieve equal rights for women. Its beginnings, and the extreme controversy it provoked in a (Western) patriarchal society, have created stigma and stereotypes around the term. Some include that feminists are:
-lesbians
-man haters
-'hairy-legged bitches'
Feminism helped women win the vote, gain equal access to jobs, pass laws against domestic violence, and secure the rights to own property, to divorce, to access birth control, and to control their own bodies.
Feminism is still needed in today's society. One can see this in the other definitions posted for this word.

Man = Woman, no more, no less. Women were, and in many social respects still are, treated as inferior.
Sexism dates back at least to Aristotle, who claimed that women were 'imperfect' men because they lacked a penis.
To be superior, one must make something else inferior. I am sure this definition will get many 'thumbs down', but it is the truth.

I am a wo-man and I am a feminist.
"I am a feminist"
"So you hate men?"
"No...I think women should have equal rights as men"
"WTF YOU SLAG CUNT FACED-BITCH"
"...I still believe in feminism."
by Oh-I'm-Sorry-Am-I-Threatening? January 14, 2010
