
Feminism

Feminism is the act of empowering women. Feminism isn't about being above men; it's about being equal to men. Many people say all feminists hate men, which isn't true. All feminists want is equal rights for women. Feminists don't agree with gender roles or society's perception of beauty.
The world does and will always need feminism.
by Young feminist May 23, 2018