
Feminism is the movement to make the rights of women EQUAL to the rights of men.

This means getting paid equally, having the same education/job opportunities, being taken more seriously, etc.
Men: *grope random female's ass*
Feminists: We need to teach men and young boys that this is wrong.
Men: OMG!! FEMINISM? MORE LIKE CANCER!..,!!, Feminists just want to control everything us men do, they want to destroy our rights, fuckkckkc feminists, (even tho without them, my mom would be unable to vote and my sister would be getting raped daily).
Men: *grope another random female's ass* hah, men and women are equal, there's no need for feminism...
by leenaroseg January 6, 2018