Feminism

Feminism is the belief that women are abused and mistreated in western society and are not treated as equals to men. when in reality it's just an outlet for women to bitch and complain about 3rd world problems that don't exist.
by Retard-daily February 13, 2017