Top definition
Whitedom: the all-encompassing system of control and world dominion created by Euro-westerners from slave principles, encoded in certain language and supported by "scientific" empirical evidence of the rule of the superior "white race" over all "inferior races". Whitedom supersedes kingdom in that rule has nothing to do with royalty but with shared inheritance.
The west is definitely a Whitedom.
by Disciple52 April 13, 2013