A slightly derogatory term for America's West Coast, used by Republicans to refer to the predominantly Democratic states of California, Oregon, and Washington.
Arnie must feel very alone on the left coast.
by Economike August 09, 2004
Used to describe the obvious side of the map by those who live on the opposite one. Otherwise known as the West Coast.
"Hey man, where you from?"
"I'm from Washington."
"No, on the Left Coast."
by Kamikazileo March 04, 2009