EUtopia: A belief held by some Americans that everything is better in Europe. A play on the word 'utopia', only discernible in written form.
Low crime rates, good education, cultured citizens, and legal weed? Yeah, sounds like a real EUtopia.
by TaxOurPie August 20, 2010
