The morally and fundamentally correct elitist belief that America is by far the single greatest nation in the history of the world.
"The fact that we have saved the world on multiple occasions from certain destruction is a testament to a true belief in Americentrism"
by James121 February 22, 2008