Definitions by Bettythefirst

Vagicentrism 

Vagicentrism is the ideology that the vagina should be the central element in the organization of the social world.
"That cave is so vagicentric!"
"That new housing proposal for Vancouver is very reflective of the vagicentrism movement."
by Bettythefirst, March 3, 2021