1 definition by NIKK3333

Top Definition
The only Western country that after WW2 didn't become an unofficial US state.
"Man...France sucks. The french dont like Bush, Hamburgers, fat people, ugly people, American english, capitalism, consumerism, huge SUVs and so many good things that improve our planet!"

" your right...Im going to germany, I heard after WW2, every girl bends over for an American or Russian !"
by NIKK3333 September 05, 2008
