Feminism

Feminism is the belief that men and women should have equal rights. It is commonly confused with the belief that women are better than or above men, but that is not feminism.
I believe in feminism, and it is not right that he/she is getting paid more/less than me!
by Lexilu6456 May 9, 2018