
Top Definition
Commonly refers to a set of countries that are predominantly white (although blacks and other ethnicities have equal rights in these countries), rich, and democratic. These countries have high standards of living and education, human rights, enough to eat, and so on.

These countries are also very attractive to those who live in the Third or Second World thanks to their prosperous economies and opportunities, so most Western Countries tend to have strict immigration laws because of this magnetism. Most are also English-speaking.

Most Western Countries also possess a powerful military that is capable of protecting their borders from unwanted attacks, and all are allied with each other in some form or another.

Countries that are considered "western countries" include:

United States of America
United Kingdom
New Zealand
Most other European Union countries
If you were born and live in any of the Western Countries, consider yourself lucky and never take your country for granted. Support the troops that defend your nation, and if you don't like it, get out, go live in Africa or South America for a while, and then see if you're still bitching.

by blowwwwwwwwwwwwwwwwwwwwwwjob August 05, 2009
