
1) France is a country in Western Europe that does not speak German and continues to have its own culture, thanks to the Americans.

2) France is a country of defenseless people who think they are great until they need help.

3) France sucks.
I live in France, so I'm a pussy!
by hdag34 September 2, 2009