Americentrism

The morally and fundamentally correct elitist belief that America is by far the single greatest nation in the history of the world.
"The fact that we have saved the world on multiple occasions from certain destruction is a testament to a true belief in Americentrism"
by James121 February 22, 2008