Whitedom: the all-encompassing system of control and world dominion created by Euro-westerners from slaveholding principles, encoded in certain language and supported by "scientific" empirical evidence of the rule of the superior "white race" over all "inferior races". Whitedom supersedes kingdom in that rule has nothing to do with royalty but with shared inheritance.
The west is definitely a Whitedom.
by Disciple52 April 13, 2013