1 definition by hdag34

1) France is a country in Western Europe that does not speak German and continues to have its own culture thanks to the Americans.

2) France is a country of defenseless people who think they are great until they need help.

3) France sucks.
I live in France, so I'm a pussy!
by hdag34 September 02, 2009
