1 definition by Thomas S

Top Definition
The United States of America is the greatest country in the world. We have worked tirelessly for many years to help hundreds of other countries, and when one idiot gets into an office of power, everybody immediately turns on us.
Americans: Man, I love the United States of America. It is a true symbol of freedom.
World: Shut up, you American pig-dogs. You helped us for many years, but now we spit on you because we are so ignorant.
by Thomas S October 15, 2007
