Calicentric

Calicentrism is a worldview that places California or Californian culture at the center of historical narratives and global events, often implying that Californian achievements and values are superior to those of other states and countries.

That dude is calicentric.
by Boar bunny experience, May 10, 2025