In short, The South is a distinct region of the United States consisting of the same states that seceded from the Union in the pre-Civil War era. If you don't know them by now, then you'll never know them. The South is growing at a more than moderate rate and is a lot more diversified than it used to be. The South is the epitome of amiability but also maintains a great sense of pride. If someone outside of The South "disses" it, a severe ass-whippin' can occur.
some idiot: You talk country.
me: What? You don't like my Southern drawl?
idiot: No. You sounded like a dumbass hick during the presentation.
me: I'm from The South, bitch. Plus, I'll be getting my Doctorate while you'll still be hunting for your Master's.
idiot: I don't care.
me: Me either. (thousands of Southerners show up at my defense)
idiot: We're all friends here... right?
by Southern1987 March 13, 2008
