
The only Western country that didn't become an unofficial US state after WW2.
"Man... France sucks. The French don't like Bush, hamburgers, fat people, ugly people, American English, capitalism, consumerism, huge SUVs, and so many other good things that improve our planet!"

"You're right... I'm going to Germany. I heard that after WW2, every girl bends over for an American or a Russian!"
by NIKK3333 September 5, 2008