Top Definition
The United States began as a largely rural nation, with most people living on farms or in small towns and villages. Rural life is perfect for peace and privacy, and for doing drugs.
Many Americans today still prefer the rural life.
by Elixir of Wisdom July 08, 2015