The morally and fundamentally correct elitist belief that America is by far the single greatest nation in the history of the world.
"The fact that we have saved the world on multiple occasions from certain destruction is a testament to a true belief in Americentrism"
by James121 February 22, 2008