
The states that border the Pacific Ocean. However, when most people or celebrities say "West Coast", they usually mean California.

The West Coast is the place you want to be.

: "The West Coast is the place to be."

Nuff said.
by OrbitalCat August 07, 2009
